Dec 06 06:57:04 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 06 06:57:04 crc restorecon[4953]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 
06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 06:57:04 crc 
restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 
06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 06:57:04 crc restorecon[4953]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:04 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 
06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc 
restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 06:57:05 crc restorecon[4953]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 06 06:57:05 crc kubenswrapper[4954]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.301780 4954 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304645 4954 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304660 4954 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304665 4954 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304669 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304672 4954 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304677 4954 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304680 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304684 4954 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304688 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304693 4954 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304698 4954 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304702 4954 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304705 4954 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304709 4954 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304712 4954 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304716 4954 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304724 4954 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304728 4954 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304732 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304736 4954 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304739 4954 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304743 4954 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304746 4954 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304750 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304754 4954 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304757 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304761 4954 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304764 4954 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304768 4954 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304771 4954 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304775 4954 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304779 4954 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304783 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304786 4954 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304790 4954 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304793 4954 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304796 4954 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304800 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304806 4954 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304810 4954 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304814 4954 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304818 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304822 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304826 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304829 4954 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304833 4954 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304836 4954 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304841 4954 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304846 4954 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304850 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304854 4954 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304859 4954 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304863 4954 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304867 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304870 4954 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304873 4954 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304877 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304882 4954 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304888 4954 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304891 4954 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304896 4954 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304900 4954 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304904 4954 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304907 4954 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304910 4954 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304914 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304917 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304921 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304924 4954 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304928 4954 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.304932 4954 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305195 4954 flags.go:64] FLAG: --address="0.0.0.0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305208 4954 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305216 4954 flags.go:64] FLAG: --anonymous-auth="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305223 4954 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305228 4954 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305232 4954 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305238 4954 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305244 4954 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305250 4954 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305255 4954 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305260 4954 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305264 4954 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305268 4954 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305272 4954 flags.go:64] FLAG: --cgroup-root=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305276 4954 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305281 4954 flags.go:64] FLAG: --client-ca-file=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305284 4954 flags.go:64] FLAG: --cloud-config=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305288 4954 flags.go:64] FLAG: --cloud-provider=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305292 4954 flags.go:64] FLAG: --cluster-dns="[]"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305299 4954 flags.go:64] FLAG: --cluster-domain=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305303 4954 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305307 4954 flags.go:64] FLAG: --config-dir=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305311 4954 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305317 4954 flags.go:64] FLAG: --container-log-max-files="5"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305326 4954 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305330 4954 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305335 4954 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305340 4954 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305344 4954 flags.go:64] FLAG: --contention-profiling="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305348 4954 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305352 4954 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305357 4954 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305362 4954 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305367 4954 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305372 4954 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305377 4954 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305381 4954 flags.go:64] FLAG: --enable-load-reader="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305386 4954 flags.go:64] FLAG: --enable-server="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305390 4954 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305396 4954 flags.go:64] FLAG: --event-burst="100"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305400 4954 flags.go:64] FLAG: --event-qps="50"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305404 4954 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305408 4954 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305413 4954 flags.go:64] FLAG: --eviction-hard=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305419 4954 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305423 4954 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305428 4954 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305432 4954 flags.go:64] FLAG: --eviction-soft=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305436 4954 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305440 4954 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305445 4954 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305486 4954 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305491 4954 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305495 4954 flags.go:64] FLAG: --fail-swap-on="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305499 4954 flags.go:64] FLAG: --feature-gates=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305504 4954 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305509 4954 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305513 4954 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305518 4954 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305522 4954 flags.go:64] FLAG: --healthz-port="10248"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305526 4954 flags.go:64] FLAG: --help="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305530 4954 flags.go:64] FLAG: --hostname-override=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305534 4954 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305541 4954 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305546 4954 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305550 4954 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305554 4954 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305572 4954 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305580 4954 flags.go:64] FLAG: --image-service-endpoint=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305584 4954 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305588 4954 flags.go:64] FLAG: --kube-api-burst="100"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305623 4954 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305629 4954 flags.go:64] FLAG: --kube-api-qps="50"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305633 4954 flags.go:64] FLAG: --kube-reserved=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305638 4954 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305644 4954 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305649 4954 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305654 4954 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305658 4954 flags.go:64] FLAG: --lock-file=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305662 4954 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305667 4954 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305671 4954 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305678 4954 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305682 4954 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305686 4954 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305690 4954 flags.go:64] FLAG: --logging-format="text"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305694 4954 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305699 4954 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305703 4954 flags.go:64] FLAG: --manifest-url=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305707 4954 flags.go:64] FLAG: --manifest-url-header=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305714 4954 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305719 4954 flags.go:64] FLAG: --max-open-files="1000000"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305725 4954 flags.go:64] FLAG: --max-pods="110"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305730 4954 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305735 4954 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305741 4954 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305746 4954 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305751 4954 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305755 4954 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305760 4954 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305773 4954 flags.go:64] FLAG: --node-status-max-images="50"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305778 4954 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305783 4954 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305788 4954 flags.go:64] FLAG: --pod-cidr=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305794 4954 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305803 4954 flags.go:64] FLAG: --pod-manifest-path=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305808 4954 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305814 4954 flags.go:64] FLAG: --pods-per-core="0"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305820 4954 flags.go:64] FLAG: --port="10250"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305825 4954 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305830 4954 flags.go:64] FLAG: --provider-id=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305835 4954 flags.go:64] FLAG: --qos-reserved=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305839 4954 flags.go:64] FLAG: --read-only-port="10255"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305845 4954 flags.go:64] FLAG: --register-node="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305849 4954 flags.go:64] FLAG: --register-schedulable="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305853 4954 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305862 4954 flags.go:64] FLAG: --registry-burst="10"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305866 4954 flags.go:64] FLAG: --registry-qps="5"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305870 4954 flags.go:64] FLAG: --reserved-cpus=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305874 4954 flags.go:64] FLAG: --reserved-memory=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305879 4954 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305883 4954 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305888 4954 flags.go:64] FLAG: --rotate-certificates="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305892 4954 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305896 4954 flags.go:64] FLAG: --runonce="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305900 4954 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305904 4954 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305909 4954 flags.go:64] FLAG: --seccomp-default="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305913 4954 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305918 4954 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305922 4954 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305926 4954 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305931 4954 flags.go:64] FLAG: --storage-driver-password="root"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305935 4954 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305940 4954 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305945 4954 flags.go:64] FLAG: --storage-driver-user="root"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305950 4954 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305955 4954 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305960 4954 flags.go:64] FLAG: --system-cgroups=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305965 4954 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305973 4954 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305979 4954 flags.go:64] FLAG: --tls-cert-file=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305983 4954 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305989 4954 flags.go:64] FLAG: --tls-min-version=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305994 4954 flags.go:64] FLAG: --tls-private-key-file=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.305998 4954 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306002 4954 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306006 4954 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306010 4954 flags.go:64] FLAG: --v="2"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306016 4954 flags.go:64] FLAG: --version="false"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306023 4954 flags.go:64] FLAG: --vmodule=""
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306028 4954 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306033 4954 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306141 4954 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306148 4954 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306152 4954 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306157 4954 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306161 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306165 4954 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306170 4954 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306175 4954 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306179 4954 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306183 4954 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306186 4954 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306190 4954 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306193 4954 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306197 4954 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306200 4954 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306204 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306208 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306212 4954 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306215 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306219 4954 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306222 4954 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306226 4954 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306230 4954 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306234 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306238 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306242 4954 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306245 4954 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306249 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306252 4954 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306256 4954 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306260 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306263 4954 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306267 4954 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306272 4954 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306278 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306282 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306286 4954 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306291 4954 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306295 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306300 4954 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306304 4954 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306309 4954 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306313 4954 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306317 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306321 4954 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306324 4954 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306328 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306389 4954 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306394 4954 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306397 4954 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306401 4954 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306405 4954 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306410 4954 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306415 4954 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306419 4954 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306422 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306426 4954 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306430 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306434 4954 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306439 4954 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306443 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306446 4954 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306450 4954 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306454 4954 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306457 4954 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306483 4954 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306487 4954 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306490 4954 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306494 4954 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306498 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.306504 4954 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.306518 4954 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.314590 4954 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.314615 4954 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314684 4954 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314689 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314694 4954 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314699 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314704 4954 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314708 4954 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314713 4954 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314717 4954 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314722 4954 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314727 4954 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314731 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314735 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314740 4954 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314744 4954 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314751 4954 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314758 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314763 4954 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314767 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314773 4954 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314777 4954 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314782 4954 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314786 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314791 4954 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314797 4954 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314802 4954 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314808 4954 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314813 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314819 4954 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314824 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314829 4954 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314833 4954 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314838 4954 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314842 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314846 4954 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314852 4954 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314856 4954 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314860 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314864 4954 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314868 4954 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314873 4954 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314877 4954 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314882 4954 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314887 4954 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314891 4954 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314895 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314899 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314903 4954 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314908 4954 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314912 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314916 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314920 4954 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314924 4954 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314928 4954 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314932 4954 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314936 4954 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314940 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314944 4954 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314949 4954 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314953 4954 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314957 4954 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314961 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314965 4954 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314969 4954 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314973 4954 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314978 4954 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314982 4954 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314986 4954 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314992 4954 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.314996 4954 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315001 4954 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315006 4954 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.315014 4954 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315152 4954 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315163 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315168 4954 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315172 4954 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315178 4954 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315182 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315186 4954 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315190 4954 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315194 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315199 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315203 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315206 4954 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315210 4954 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315215 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315220 4954 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315225 4954 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315229 4954 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315234 4954 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315239 4954 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315244 4954 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315248 4954 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315253 4954 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315258 4954 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315268 4954 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315276 4954 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315281 4954 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315285 4954 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315290 4954 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315295 4954 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315299 4954 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315303 4954 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315309 4954 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315314 4954 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315318 4954 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315323 4954 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315327 4954 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315332 4954 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315336 4954 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315341 4954 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315345 4954 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315350 4954 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315354 4954 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315358 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315362 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315367 4954 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315371 4954 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315376 4954 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315379 4954 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315384 4954 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315389 4954 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315393 4954 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315397 4954 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315403 4954 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315408 4954 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315414 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315418 4954 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315423 4954 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315427 4954 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315432 4954 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315437 4954 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315442 4954 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315447 4954 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315451 4954 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315456 4954 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315461 4954 feature_gate.go:330] unrecognized feature gate: Example
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315465 4954 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315469 4954 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315474 4954 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315478 4954 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315482 4954 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.315488 4954 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.315495 4954 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.315708 4954 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.318992 4954 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.319104 4954 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.319723 4954 server.go:997] "Starting client certificate rotation"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.319747 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.319968 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 07:32:31.474289465 +0000 UTC
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.320140 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.327287 4954 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.329277 4954 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.341508 4954 log.go:25] "Validated CRI v1 runtime API"
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.342939 4954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.355665 4954 log.go:25] "Validated CRI v1 image API"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.358000 4954 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.360883 4954 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-06-06-47-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.360935 4954 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.378337 4954 manager.go:217] Machine: {Timestamp:2025-12-06 06:57:05.376920511 +0000 UTC m=+0.190279920 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a29bc87b-f01f-4645-b6f9-deab79f2c4b3 BootID:9b7a4e02-eeee-45d6-a19e-861f2b430acc Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22
Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7d:02:15 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7d:02:15 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c8:cb:da Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9a:ef:23 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:13:6d:a4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ac:00:d3 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:f4:db:e7 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:98:54:56 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:c5:c6:c4:dd:41 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ea:71:f9:2e:24:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.378690 4954 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.378863 4954 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379223 4954 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379423 4954 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379458 4954 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379693 4954 topology_manager.go:138] "Creating topology manager with none policy"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379704 4954 container_manager_linux.go:303] "Creating device plugin manager"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379879 4954 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.379901 4954 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380222 4954 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380308 4954 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380929 4954 kubelet.go:418] "Attempting to sync node with API server"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380947 4954 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380971 4954 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380983 4954 kubelet.go:324] "Adding apiserver pod source"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.380997 4954 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.382827 4954 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.383046 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.383156 4954 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.383155 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError"
Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.383470 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.383691 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384159 4954 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384707 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384738 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384747 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384756 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384769 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384778 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384785 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384798 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384807 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384815 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384825 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.384831 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.385002 4954 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.385402 4954 server.go:1280] "Started kubelet"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.386834 4954 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.386632 4954 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.387491 4954 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.387932 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.389153 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.114:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e8dfdd92895ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 06:57:05.385371051 +0000 UTC m=+0.198730440,LastTimestamp:2025-12-06 06:57:05.385371051 +0000 UTC m=+0.198730440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 06 06:57:05 crc systemd[1]: Started Kubernetes Kubelet.
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.390610 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.390647 4954 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.390857 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:16:19.950636158 +0000 UTC
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.390915 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.390921 4954 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 915h19m14.559719452s for next certificate rotation
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.390998 4954 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.391007 4954 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.391175 4954 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.391354 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="200ms"
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.394396 4954 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.394419 4954 factory.go:55] Registering systemd factory
Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.394432
4954 factory.go:221] Registration of the systemd container factory successfully Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.394711 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.394815 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.395136 4954 factory.go:153] Registering CRI-O factory Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.395184 4954 factory.go:221] Registration of the crio container factory successfully Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.395264 4954 factory.go:103] Registering Raw factory Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.395290 4954 manager.go:1196] Started watching for new ooms in manager Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.396549 4954 manager.go:319] Starting recovery of all containers Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.396882 4954 server.go:460] "Adding debug handlers to kubelet server" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405487 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405627 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405655 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405702 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405724 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405744 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405765 4954 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405787 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405809 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405831 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405850 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405870 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405890 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405911 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405929 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405949 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.405973 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406000 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406030 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406055 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406081 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406108 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406136 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406169 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406198 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406226 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406257 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406278 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406299 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406320 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406341 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406359 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406379 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406401 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406419 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406439 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406460 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406480 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406499 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406517 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406537 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406556 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406610 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406631 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406649 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406670 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406690 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406710 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406769 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406795 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406815 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406835 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406863 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406886 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406912 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406933 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406956 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406977 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.406999 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407019 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407039 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407060 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407081 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407100 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407121 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407140 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407159 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407178 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407200 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407225 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407252 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407280 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407308 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407337 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407365 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407392 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407414 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407435 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407453 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407472 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407493 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407515 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407534 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407553 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407602 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407619 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407639 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407658 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407676 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407695 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407713 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407732 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.407751 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412529 4954 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412664 4954 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412769 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412819 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412860 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.412886 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413013 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413036 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413062 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413076 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413090 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413111 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413144 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413164 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413188 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413205 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413223 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413243 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413264 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413281 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413300 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413318 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413330 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413344 4954 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413362 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413373 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413391 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413405 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413418 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413436 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413450 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413463 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413482 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413496 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413513 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413525 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413537 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413553 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413578 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413593 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413643 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413659 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413678 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413690 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413706 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413718 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413731 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413749 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413764 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413780 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413795 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413812 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413833 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413853 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413878 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413903 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413919 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413940 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413957 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.413975 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414002 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414019 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414044 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414062 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414081 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414103 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414119 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414139 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414152 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414203 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414220 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414233 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414250 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414264 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414276 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414293 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414305 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414320 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414334 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414348 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414363 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414375 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414390 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414405 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414420 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414435 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414448 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414461 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414478 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414490 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414507 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414518 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414529 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414549 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414559 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414596 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414608 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414619 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414635 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414644 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414664 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414679 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414690 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414707 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414718 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414730 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414748 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414761 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414779 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414791 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414805 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414823 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414836 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414853 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414867 4954 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414877 4954 reconstruct.go:97] "Volume reconstruction finished" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.414885 4954 reconciler.go:26] "Reconciler: start to sync state" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.421877 4954 manager.go:324] Recovery completed Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.437145 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.438716 4954 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.439208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.439249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.439262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.440073 4954 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.440098 4954 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.440119 4954 state_mem.go:36] "Initialized new in-memory state store" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.441979 4954 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.442034 4954 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 06 06:57:05 crc kubenswrapper[4954]: I1206 06:57:05.442076 4954 kubelet.go:2335] "Starting kubelet main sync loop" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.442136 4954 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 06 06:57:05 crc kubenswrapper[4954]: W1206 06:57:05.443063 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.443115 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.491697 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.542808 4954 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.592209 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.593133 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="400ms" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.693242 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.743520 4954 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.793938 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.894377 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.935672 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.114:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e8dfdd92895ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 06:57:05.385371051 +0000 UTC m=+0.198730440,LastTimestamp:2025-12-06 06:57:05.385371051 +0000 UTC m=+0.198730440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.994905 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:05 crc kubenswrapper[4954]: E1206 06:57:05.994979 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="800ms" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.079192 4954 policy_none.go:49] "None policy: Start" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.081872 4954 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.082054 4954 state_mem.go:35] "Initializing new in-memory state store" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.095359 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.144410 4954 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.196008 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:06 crc kubenswrapper[4954]: W1206 06:57:06.276851 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.276937 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.296629 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.389537 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.397196 4954 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.410380 4954 manager.go:334] "Starting Device Plugin manager" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.410485 4954 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.410508 4954 server.go:79] "Starting device plugin registration server" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.411233 4954 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.411268 4954 container_log_manager.go:189] "Initializing container log rotate 
workers" workers=1 monitorPeriod="10s" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.411610 4954 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.411790 4954 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.411807 4954 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.419412 4954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 06:57:06 crc kubenswrapper[4954]: W1206 06:57:06.457400 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.457502 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.511758 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.513495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.513558 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.513602 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.513642 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.514342 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.114:6443: connect: connection refused" node="crc" Dec 06 06:57:06 crc kubenswrapper[4954]: W1206 06:57:06.620236 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.620339 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.715174 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.716914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.716956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.716966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.716994 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.717708 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.114:6443: connect: connection refused" node="crc" Dec 06 06:57:06 crc kubenswrapper[4954]: W1206 06:57:06.744985 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.745147 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:06 crc kubenswrapper[4954]: E1206 06:57:06.796474 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="1.6s" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.945314 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.945420 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.946670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.946711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.946723 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.946851 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947130 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947165 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947665 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947676 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947747 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947887 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.947932 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948278 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948792 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948893 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.948994 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949023 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949524 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949579 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949593 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949706 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949847 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949858 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949881 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.949934 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.950812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.950853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.950865 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951028 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951068 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951775 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:06 crc kubenswrapper[4954]: I1206 06:57:06.951783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.036670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.036754 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.036810 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.036861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.036896 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037009 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037102 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037172 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037223 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037305 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037463 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.037509 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.118846 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.120447 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.120488 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.120504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.120538 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: E1206 06:57:07.121107 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.114:6443: connect: connection refused" node="crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139398 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139496 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139532 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139557 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139617 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139640 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139661 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
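The registration failure above is a plain TCP refusal: nothing is listening on api-int.crc.testing:6443 (38.129.56.114) yet, because the kube-apiserver static pod whose volumes are being mounted here has not started serving. A minimal sketch for telling this log's two connection failure modes apart; the address is copied from the records above, and the code is illustrative, not kubelet source:

```go
// conncheck.go - distinguish "connection refused" (no listener) from a TLS
// handshake stall (listener up, server still initializing). Illustrative only.
package main

import (
	"crypto/tls"
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "api-int.crc.testing:6443" // endpoint taken from the log above

	// Stage 1: TCP connect. "connect: connection refused" here matches the
	// kubelet_node_status.go:99 errors in this log.
	conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
	if err != nil {
		fmt.Println("tcp:", err)
		return
	}
	defer conn.Close()

	// Stage 2: TLS handshake. A timeout here matches the later
	// "net/http: TLS handshake timeout" errors (from 06:57:19 onward).
	tc := tls.Client(conn, &tls.Config{InsecureSkipVerify: true}) // diagnostic only
	tc.SetDeadline(time.Now().Add(5 * time.Second))
	if err := tc.Handshake(); err != nil {
		fmt.Println("tls:", err)
		return
	}
	fmt.Println("endpoint is serving TLS")
}
```

The progression visible in this log is exactly that: refused dials at 06:57:07, then TLS handshake timeouts from 06:57:19 once a listener exists but the apiserver is still coming up.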
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139666 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139698 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139736 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139764 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139806 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139796 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139695 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139827 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139867 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139754 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139878 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139992 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.140006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.140032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.139882 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.140138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.293522 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.313164 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: W1206 06:57:07.319977 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ace1d562ea7753cf9cec2f35af52aa94d2b73c9614636b77a3805d0b71650e04 WatchSource:0}: Error finding container ace1d562ea7753cf9cec2f35af52aa94d2b73c9614636b77a3805d0b71650e04: Status 404 returned error can't find the container with id ace1d562ea7753cf9cec2f35af52aa94d2b73c9614636b77a3805d0b71650e04 Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.321328 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: W1206 06:57:07.335357 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-db60964b712552029bf5e34886c2508ea6c65c5bd59679c74444b5116ef8b223 WatchSource:0}: Error finding container db60964b712552029bf5e34886c2508ea6c65c5bd59679c74444b5116ef8b223: Status 404 returned error can't find the container with id db60964b712552029bf5e34886c2508ea6c65c5bd59679c74444b5116ef8b223 Dec 06 06:57:07 crc kubenswrapper[4954]: W1206 06:57:07.336484 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-34723d36e07da28dbb79cd41b56289e028eeac8524229eb0a083e4b6685f05ab WatchSource:0}: Error finding container 34723d36e07da28dbb79cd41b56289e028eeac8524229eb0a083e4b6685f05ab: Status 404 returned error can't find the container with id 34723d36e07da28dbb79cd41b56289e028eeac8524229eb0a083e4b6685f05ab Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.341254 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.346777 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:07 crc kubenswrapper[4954]: W1206 06:57:07.358607 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0c6c3828707b0d720380ffaca288e6ad9188d28f1e9adebc8e16fa775cc8938c WatchSource:0}: Error finding container 0c6c3828707b0d720380ffaca288e6ad9188d28f1e9adebc8e16fa775cc8938c: Status 404 returned error can't find the container with id 0c6c3828707b0d720380ffaca288e6ad9188d28f1e9adebc8e16fa775cc8938c Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.389171 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:07 crc kubenswrapper[4954]: W1206 06:57:07.414521 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-913f7a81a7299bf9ac57a371fbaff5068f1eaec0d262329f4baedd69a4e8859a WatchSource:0}: Error finding container 913f7a81a7299bf9ac57a371fbaff5068f1eaec0d262329f4baedd69a4e8859a: Status 404 returned error can't find the container with id 913f7a81a7299bf9ac57a371fbaff5068f1eaec0d262329f4baedd69a4e8859a Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.448807 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.449068 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"34723d36e07da28dbb79cd41b56289e028eeac8524229eb0a083e4b6685f05ab"} Dec 06 06:57:07 crc kubenswrapper[4954]: E1206 06:57:07.449810 4954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: 
Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.450457 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"db60964b712552029bf5e34886c2508ea6c65c5bd59679c74444b5116ef8b223"} Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.451707 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ace1d562ea7753cf9cec2f35af52aa94d2b73c9614636b77a3805d0b71650e04"} Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.452490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"913f7a81a7299bf9ac57a371fbaff5068f1eaec0d262329f4baedd69a4e8859a"} Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.453367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0c6c3828707b0d720380ffaca288e6ad9188d28f1e9adebc8e16fa775cc8938c"} Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.921288 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.924542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.924620 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.924634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:07 crc kubenswrapper[4954]: I1206 06:57:07.924670 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 06:57:07 crc kubenswrapper[4954]: E1206 06:57:07.925208 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.114:6443: connect: connection refused" node="crc" Dec 06 06:57:08 crc kubenswrapper[4954]: W1206 06:57:08.304441 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:08 crc kubenswrapper[4954]: E1206 06:57:08.304618 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.389067 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
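certificate_manager.go:356 is the kubelet's client-certificate rotation loop: it generates a key, POSTs a CertificateSigningRequest for the kubernetes.io/kube-apiserver-client-kubelet signer, and waits for the request to be approved and issued. With the apiserver unreachable the POST fails and the manager retries with backoff, which is why the same error recurs at 06:57:21 with a TLS handshake timeout instead of a refusal. Once the API answers, the pending request is an ordinary CSR object; a hedged client-go sketch for inspecting it (the kubeconfig path is a placeholder):

```go
// csrlist.go - list CertificateSigningRequests once the apiserver responds;
// sketch only, the kubeconfig path is illustrative.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().
		List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, csr := range csrs.Items {
		// kubernetes.io/kube-apiserver-client-kubelet is the signer named
		// in the rotation messages above.
		fmt.Println(csr.Name, csr.Spec.SignerName)
	}
}
```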
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:08 crc kubenswrapper[4954]: E1206 06:57:08.397848 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="3.2s" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.457204 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c" exitCode=0 Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.457282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c"} Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.457440 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.459269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.459321 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.459330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.460097 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3" exitCode=0 Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.460153 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3"} Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.460179 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.461228 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.461959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462006 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462122 4954 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462592 4954 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540" exitCode=0 Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540"} Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.462716 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.463328 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.463366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.463377 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.465011 4954 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d" exitCode=0 Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.465078 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.465086 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d"} Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.467044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.467072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.467082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.476957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5a7fed55e7ec73f999696e646773214143d55cd4b29be33cc77433e0b2eca68"} Dec 06 06:57:08 crc kubenswrapper[4954]: I1206 06:57:08.477015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de5913af46a45e08c8e6774fc882ac9c66797ddc6429110d193bab506e91126e"} Dec 06 06:57:08 crc kubenswrapper[4954]: W1206 06:57:08.723503 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
38.129.56.114:6443: connect: connection refused Dec 06 06:57:08 crc kubenswrapper[4954]: E1206 06:57:08.723628 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:09 crc kubenswrapper[4954]: W1206 06:57:09.034961 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:09 crc kubenswrapper[4954]: E1206 06:57:09.035053 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:09 crc kubenswrapper[4954]: W1206 06:57:09.072848 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.114:6443: connect: connection refused Dec 06 06:57:09 crc kubenswrapper[4954]: E1206 06:57:09.072956 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.114:6443: connect: connection refused" logger="UnhandledError" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.483844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.483955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.483973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.483994 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.498268 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27" exitCode=0 Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.498360 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
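The reflector.go warnings all come from the kubelet's client-go informers (factory.go:160), which keep local caches of Services, Nodes, CSIDrivers, and RuntimeClasses by listing each type and then watching for changes; a failed list is retried with backoff, so the messages repeat until the apiserver responds but are harmless during startup. A minimal sketch of the same list-and-watch pattern through a shared informer factory (illustrative only; the resync period and kubeconfig path are arbitrary, not values from this log):

```go
// informer.go - the list/watch machinery behind the reflector warnings above.
package main

import (
	"fmt"
	"log"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	factory := informers.NewSharedInformerFactory(cs, 30*time.Second)
	inf := factory.Core().V1().Services().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Println("service:", obj.(*corev1.Service).Name)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)            // reflector list+watch begins here
	factory.WaitForCacheSync(stop) // blocks until the initial list succeeds
	time.Sleep(5 * time.Second)
}
```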
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.498442 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.499587 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.499626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.499640 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.501439 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b60c69751b0a690ad14e6835009a7861b14636a1e0c533211456f376f0368bd1"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.501497 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.502609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.502651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.502661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.504414 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.504466 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.504473 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.504487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.506784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.506825 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.506842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:09 crc 
kubenswrapper[4954]: I1206 06:57:09.509082 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"862cc099ebb60b745732109a65a2cec3e28dbbf6ff9e9704fb1f48f112b18ac4"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.509114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1ed952d95a1a788963e2d1e27b9c83eb3ffeae7db1666d738d914e6f374f6a3f"} Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.509166 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.509937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.509972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.509987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.525544 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.527374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.527432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.527445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.527473 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 06:57:09 crc kubenswrapper[4954]: I1206 06:57:09.630606 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.515294 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09c3ed8778c556151da2bb1c3876a74093d8d3286abbd138fba4a94f8eefd7eb"} Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.515364 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.516475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.516505 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.516515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518070 4954 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="8684a7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047" exitCode=0 Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518163 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518533 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8684a7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047"} Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518776 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518797 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.518829 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.519054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.519082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.519094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520373 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520497 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:10 crc kubenswrapper[4954]: I1206 06:57:10.520869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.455329 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.526092 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f25a446fadfb363ee38298b4b5a561017cb89c2c9099555ed50a8ca7d9e1296"} Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.526264 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.526289 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.526366 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528225 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528239 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528378 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:11 crc kubenswrapper[4954]: I1206 06:57:11.528400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.536215 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ddfa018798e201b7ae74d5c443999b108945c84a4a3fa803abb26256c8a4a8dc"} Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.536292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b4b4f2933b80af3e40fc62e3a39fc6ed040324911ff2e2ede84b7eb9d2705c31"} Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.536316 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"86c6758d19607e84d046b2f58804bf8cac4af3bf5b40ed7d6cf1541d73006db2"} Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.536336 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4812b6f0d6a5ec757fa7d811ea0e10399dbc2a99eef24403cabe4f78b8b9f070"} Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.536378 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.537927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.538008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.538032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.631041 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.631167 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.752090 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.752331 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.752386 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.754172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.754241 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.754273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:12 crc kubenswrapper[4954]: I1206 06:57:12.998488 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.539661 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.539742 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.539855 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541361 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541435 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541773 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:13 crc kubenswrapper[4954]: I1206 06:57:13.541855 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.125106 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.125338 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.126782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.126822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.126836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.563709 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.563942 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.565366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.565402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:14 crc kubenswrapper[4954]: I1206 06:57:14.565412 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.111722 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.111992 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.113821 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.113902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.113926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.927500 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.927860 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.929600 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.929651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:15 crc kubenswrapper[4954]: I1206 06:57:15.929662 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:16 crc kubenswrapper[4954]: E1206 06:57:16.419542 4954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.518995 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.520815 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.522309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.522366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.522379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.526090 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.552803 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.553054 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.553767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.553799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.553808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.557820 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.876781 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.877088 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.878694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.878738 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:17 crc kubenswrapper[4954]: I1206 06:57:17.878750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:18 crc kubenswrapper[4954]: I1206 06:57:18.555753 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:18 crc kubenswrapper[4954]: I1206 06:57:18.557875 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
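The eviction manager error at 06:57:16 above (repeated at 06:57:26 below) follows directly from the registration failures earlier in the log: summary stats are resolved against the Node object, and every "Attempting to register node" so far has been refused, so from the cluster's point of view node "crc" does not exist yet. Whether the object has appeared is a single Get against the core API; a sketch (kubeconfig path illustrative):

```go
// nodecheck.go - check whether the Node object "crc" exists yet; its absence
// is what eviction_manager.go:285 keeps reporting. Sketch only.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		fmt.Println("not registered yet:", err) // matches the eviction manager error
		return
	}
	fmt.Println("registered:", node.Name, "created", node.CreationTimestamp)
}
```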
Dec 06 06:57:18 crc kubenswrapper[4954]: I1206 06:57:18.557962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:18 crc kubenswrapper[4954]: I1206 06:57:18.558027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:19 crc kubenswrapper[4954]: I1206 06:57:19.391033 4954 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 06 06:57:19 crc kubenswrapper[4954]: E1206 06:57:19.529388 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 06 06:57:19 crc kubenswrapper[4954]: I1206 06:57:19.558422 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:19 crc kubenswrapper[4954]: I1206 06:57:19.560896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:19 crc kubenswrapper[4954]: I1206 06:57:19.560953 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:19 crc kubenswrapper[4954]: I1206 06:57:19.560965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:21 crc kubenswrapper[4954]: E1206 06:57:21.457336 4954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 06 06:57:21 crc kubenswrapper[4954]: E1206 06:57:21.599072 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s"
Dec 06 06:57:22 crc kubenswrapper[4954]: W1206 06:57:22.591734 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.591913 4954 trace.go:236] Trace[1308009039]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 06:57:12.589) (total time: 10002ms):
Dec 06 06:57:22 crc kubenswrapper[4954]: Trace[1308009039]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:57:22.591)
Dec 06 06:57:22 crc kubenswrapper[4954]: Trace[1308009039]: [10.00218238s] [10.00218238s] END
Dec 06 06:57:22 crc kubenswrapper[4954]: E1206 06:57:22.591956 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.631654 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.631776 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.730866 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.733088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.733143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.733174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.733213 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.753392 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body=
Dec 06 06:57:22 crc kubenswrapper[4954]: I1206 06:57:22.753497 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded"
Dec 06 06:57:23 crc kubenswrapper[4954]: W1206 06:57:23.197935 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 06 06:57:23 crc kubenswrapper[4954]: I1206 06:57:23.198088 4954 trace.go:236] Trace[927696452]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 06:57:13.195) (total time: 10002ms):
Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[927696452]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:57:23.197)
Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[927696452]: [10.002431728s] [10.002431728s] END
Dec 06 06:57:23 crc kubenswrapper[4954]: E1206 06:57:23.198123 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
logger="UnhandledError" Dec 06 06:57:23 crc kubenswrapper[4954]: W1206 06:57:23.253960 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 06 06:57:23 crc kubenswrapper[4954]: I1206 06:57:23.254096 4954 trace.go:236] Trace[1957040907]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 06:57:13.251) (total time: 10002ms): Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[1957040907]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:57:23.253) Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[1957040907]: [10.002272604s] [10.002272604s] END Dec 06 06:57:23 crc kubenswrapper[4954]: E1206 06:57:23.254122 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 06 06:57:23 crc kubenswrapper[4954]: W1206 06:57:23.327756 4954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 06 06:57:23 crc kubenswrapper[4954]: I1206 06:57:23.327904 4954 trace.go:236] Trace[647507472]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 06:57:13.326) (total time: 10001ms): Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[647507472]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:57:23.327) Dec 06 06:57:23 crc kubenswrapper[4954]: Trace[647507472]: [10.001534014s] [10.001534014s] END Dec 06 06:57:23 crc kubenswrapper[4954]: E1206 06:57:23.327937 4954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 06 06:57:24 crc kubenswrapper[4954]: I1206 06:57:24.380709 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 06:57:24 crc kubenswrapper[4954]: I1206 06:57:24.380791 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 06:57:26 crc kubenswrapper[4954]: E1206 06:57:26.419928 4954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 
06:57:27.759740 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.760180 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.761906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.761965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.761978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.766295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.909648 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.909856 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.911286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.911356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.911374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:27 crc kubenswrapper[4954]: I1206 06:57:27.924591 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.583543 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.583632 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.583695 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.584658 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.584707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.584725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.584993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.585029 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:28 crc kubenswrapper[4954]: I1206 06:57:28.585045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:29 crc 
kubenswrapper[4954]: I1206 06:57:29.381605 4954 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 06 06:57:29 crc kubenswrapper[4954]: E1206 06:57:29.382351 4954 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.412102 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.412126 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.412215 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.412291 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.416922 4954 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46662->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.417015 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46662->192.168.126.11:17697: read: connection reset by peer" Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.850007 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 06 06:57:29 crc kubenswrapper[4954]: I1206 06:57:29.867342 4954 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.005071 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.005281 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.006698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.006755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:30 crc 
kubenswrapper[4954]: I1206 06:57:30.006770 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.010594 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.591143 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.593277 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="09c3ed8778c556151da2bb1c3876a74093d8d3286abbd138fba4a94f8eefd7eb" exitCode=255 Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.593369 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"09c3ed8778c556151da2bb1c3876a74093d8d3286abbd138fba4a94f8eefd7eb"} Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.593433 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.593517 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594695 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.594808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.595858 4954 scope.go:117] "RemoveContainer" containerID="09c3ed8778c556151da2bb1c3876a74093d8d3286abbd138fba4a94f8eefd7eb" Dec 06 06:57:30 crc kubenswrapper[4954]: I1206 06:57:30.743737 4954 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.315958 4954 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.597966 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.598916 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 06:57:31 crc 
kubenswrapper[4954]: I1206 06:57:31.602086 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" exitCode=255 Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.602144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335"} Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.602257 4954 scope.go:117] "RemoveContainer" containerID="09c3ed8778c556151da2bb1c3876a74093d8d3286abbd138fba4a94f8eefd7eb" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.602381 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.603854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.603931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.603957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:31 crc kubenswrapper[4954]: I1206 06:57:31.605330 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 06:57:31 crc kubenswrapper[4954]: E1206 06:57:31.605656 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 06:57:32 crc kubenswrapper[4954]: I1206 06:57:32.608684 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.060311 4954 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.212375 4954 csr.go:261] certificate signing request csr-skzcj is approved, waiting to be issued Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.221127 4954 csr.go:257] certificate signing request csr-skzcj is issued Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.398648 4954 apiserver.go:52] "Watching apiserver" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.402827 4954 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.403230 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 
06:57:33.403666 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.403795 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.403920 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.403817 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.404132 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.404169 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.404219 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.404268 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.404285 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.406198 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.406670 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.407018 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.407183 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.408640 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.408684 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.409127 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.409193 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.409333 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.432249 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.445130 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.464958 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.474907 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.485416 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.492215 4954 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.496220 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.506452 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.506509 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.506543 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.506978 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507046 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507066 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507083 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507192 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507358 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507502 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.507101 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508031 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508151 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508310 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508053 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508399 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508480 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508503 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508530 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508577 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508605 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508632 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508646 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508660 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508690 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508717 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508745 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508774 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508799 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508823 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508852 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508856 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508879 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508912 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508939 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.508991 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509016 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509048 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509071 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509096 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509147 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509173 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509198 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509221 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509231 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509272 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509299 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509323 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509350 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509377 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509402 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509425 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509473 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509498 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509519 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509540 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509588 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509794 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509864 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509887 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509948 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.509976 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510001 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510027 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510236 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510290 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510331 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510360 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510387 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510409 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510435 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510467 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510517 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510539 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510581 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510603 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510625 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510648 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510669 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510689 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510739 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510762 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510788 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510817 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510842 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510863 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510904 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510909 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510938 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510963 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.510986 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511006 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511027 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511049 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511072 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511092 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511144 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511167 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511188 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511193 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511220 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511269 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511288 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511309 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511328 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511352 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511375 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511381 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511397 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511421 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511442 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511463 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511484 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511511 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511534 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc 
kubenswrapper[4954]: I1206 06:57:33.511853 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511930 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512175 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512196 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512379 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512428 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.511555 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512668 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512691 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512710 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512730 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512767 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512787 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512806 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512822 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512839 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512858 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512885 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.512910 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513014 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513021 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513027 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513613 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513684 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513732 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513799 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513890 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514027 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514093 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514118 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514124 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514295 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514313 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514573 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514884 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.514987 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515121 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515130 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515129 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515400 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515415 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.515758 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516034 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516342 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516348 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516426 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516595 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516705 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516627 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516720 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516804 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516882 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.516946 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517137 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517139 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517354 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517515 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517617 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517690 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517707 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517738 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517837 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.517884 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.518064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.519351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.519804 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.520136 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.520429 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.520525 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.520882 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.518088 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.521108 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.521174 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.521358 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.513036 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528875 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528941 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528965 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.528992 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529026 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529048 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529069 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529117 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529159 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529182 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529204 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529229 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529256 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529279 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529301 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529323 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529347 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529370 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529393 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529424 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529469 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529494 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529518 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529540 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529588 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529614 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529638 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529661 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529681 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529703 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529724 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529748 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529771 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529794 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529819 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529846 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529894 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529943 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530018 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530043 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530152 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530204 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530228 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530254 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530278 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530301 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530326 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530350 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530373 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530396 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530419 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530444 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530468 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530492 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530516 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530737 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530854 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530887 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530938 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531024 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531051 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531106 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531133 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531348 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531381 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531602 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531630 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531644 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on 
node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.532422 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533171 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533196 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533216 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533234 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533258 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533282 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533298 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533314 4954 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533327 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533343 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533357 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533371 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: 
I1206 06:57:33.533715 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533766 4954 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533789 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533805 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533817 4954 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533828 4954 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533838 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533850 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533864 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533881 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533895 4954 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533905 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533915 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc 
kubenswrapper[4954]: I1206 06:57:33.533924 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533935 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533946 4954 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533957 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533967 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533978 4954 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533987 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533997 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534008 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534028 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534039 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534049 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534061 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 
06:57:33.534072 4954 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534082 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534092 4954 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534103 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534114 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534125 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534135 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534143 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534153 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534162 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534173 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534185 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534203 4954 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc 
kubenswrapper[4954]: I1206 06:57:33.534218 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534232 4954 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534246 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534259 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534274 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534287 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534300 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534313 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534327 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534340 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534352 4954 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534366 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534377 4954 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node 
\"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534375 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534389 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534511 4954 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534529 4954 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534547 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534584 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534599 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534611 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534624 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534636 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534648 4954 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534665 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534676 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.521412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.521937 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.522330 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.522434 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.522628 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.522672 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.529945 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.530017 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531095 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531418 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.531683 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.532112 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.532167 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.532979 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533298 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533729 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.535935 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534186 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.536021 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:57:34.035987638 +0000 UTC m=+28.849347217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534699 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.533960 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.534981 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.535046 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.535896 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.535908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536242 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536274 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536313 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536297 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536637 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.536644 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537106 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537094 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537492 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537843 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537925 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.537860 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540027 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540069 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540204 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540238 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540280 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.541079 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.540827 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.541283 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.541552 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.541668 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.541848 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542188 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542297 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542341 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542360 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542367 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543078 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543200 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543312 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543376 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543536 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543589 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543540 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543788 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.543825 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544031 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544143 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544195 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.544450 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544545 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.544589 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:34.044546594 +0000 UTC m=+28.857906193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544647 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544547 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544679 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.544746 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.544826 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.544901 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:34.044879514 +0000 UTC m=+28.858239113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.545272 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.545357 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.545610 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.547057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.542932 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.547781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.548495 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.548817 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.549589 4954 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.552008 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.553775 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.554215 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.557507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.559347 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.559820 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.559838 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.559850 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.559912 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:34.059888226 +0000 UTC m=+28.873247615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.559891 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.560125 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.560148 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.560162 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:33 crc kubenswrapper[4954]: E1206 06:57:33.560229 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:34.060202985 +0000 UTC m=+28.873562575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.563295 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.564978 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.564427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.566065 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.566799 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.568310 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.569405 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.569520 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.571376 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.571928 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.572017 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.572545 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.572763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.572908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.572929 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.573794 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.574180 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.575705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.575944 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.575981 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.576270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.576698 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.576886 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.577799 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.578346 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.578477 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581338 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581604 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581667 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581801 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581936 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.581940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.582099 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.583918 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.596054 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.599302 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.611943 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635550 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635579 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635589 4954 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635598 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635607 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635616 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635626 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635681 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635720 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635734 4954 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635746 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635756 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635764 4954 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635773 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635781 4954 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635789 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635797 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635805 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635814 4954 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635822 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635830 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635839 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635848 4954 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635859 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635868 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635877 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635887 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635896 4954 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635906 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635915 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635925 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635933 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635942 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635951 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635960 4954 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635968 4954 reconciler_common.go:293] "Volume detached for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635977 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635986 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.635995 4954 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636004 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636013 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636022 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636030 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636038 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636047 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636056 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636065 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636073 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636083 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636092 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636107 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636115 4954 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636124 4954 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636135 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636146 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636159 4954 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636178 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636205 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636220 4954 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636238 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636251 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636263 4954 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636274 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636286 4954 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636295 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636317 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636327 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636346 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636359 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636370 4954 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636381 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636394 4954 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636403 4954 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636412 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636421 4954 reconciler_common.go:293] "Volume 
detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636430 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636439 4954 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636447 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636458 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636466 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636476 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636486 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636494 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636502 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636509 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636518 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636525 4954 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636533 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636542 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636551 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636598 4954 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636606 4954 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636614 4954 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636623 4954 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636631 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636638 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636646 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636654 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636661 4954 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636669 4954 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636677 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636685 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636694 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636701 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636709 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636716 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636724 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636732 4954 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636745 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636753 4954 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636761 4954 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636769 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636777 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636785 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636792 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.636911 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.717556 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.724773 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 06:57:33 crc kubenswrapper[4954]: I1206 06:57:33.730861 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 06:57:33 crc kubenswrapper[4954]: W1206 06:57:33.735624 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-94a789dda653f440d4d2facc2f17affdc1da7521236b7996ad427c87f1968e19 WatchSource:0}: Error finding container 94a789dda653f440d4d2facc2f17affdc1da7521236b7996ad427c87f1968e19: Status 404 returned error can't find the container with id 94a789dda653f440d4d2facc2f17affdc1da7521236b7996ad427c87f1968e19 Dec 06 06:57:33 crc kubenswrapper[4954]: W1206 06:57:33.741456 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-d78c3a5f29ab9ff36dd71cc024c35eb4c4afc5ad59a57720f02993a16da84f58 WatchSource:0}: Error finding container d78c3a5f29ab9ff36dd71cc024c35eb4c4afc5ad59a57720f02993a16da84f58: Status 404 returned error can't find the container with id d78c3a5f29ab9ff36dd71cc024c35eb4c4afc5ad59a57720f02993a16da84f58 Dec 06 06:57:33 crc kubenswrapper[4954]: W1206 06:57:33.747549 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-84465a5bd1e47c0865031027e9bea02eb3ffcf6bc22b772babe09f4ce7cd4ae0 WatchSource:0}: Error finding container 84465a5bd1e47c0865031027e9bea02eb3ffcf6bc22b772babe09f4ce7cd4ae0: Status 404 returned error can't find the container with id 84465a5bd1e47c0865031027e9bea02eb3ffcf6bc22b772babe09f4ce7cd4ae0 Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.039802 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.040082 4954 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:57:35.040035481 +0000 UTC m=+29.853394870 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.140736 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.140793 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.140817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.140841 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.140964 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141029 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:35.14101134 +0000 UTC m=+29.954370729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141101 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141155 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141169 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141235 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:35.141213926 +0000 UTC m=+29.954573315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141100 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141299 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141316 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141319 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141353 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:35.14134454 +0000 UTC m=+29.954703929 (durationBeforeRetry 1s). 
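
Note: the "object ... not registered" errors here do not mean the configmaps or secrets are missing from the API server. After a restart the kubelet serves secret and configmap reads for volume mounts from per-object watch caches, and MountVolume.SetUp is refused with exactly this error until the cache for the referenced object has been set up and synced; the later "Caches populated for *v1.ConfigMap ..." lines mark each cache coming up, after which the 1s retries succeed. The sketch below shows the same wait-for-sync pattern with client-go informers; the kubeconfig path is assumed, while the namespace and configmap name are taken from the log.

package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Illustrates the wait-for-cache-sync pattern behind the kubelet's
	// "not registered" mount errors: reads from a watch cache are refused
	// until the informer for that resource has synced once.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	factory := informers.NewSharedInformerFactoryWithOptions(
		cs, 30*time.Second, informers.WithNamespace("openshift-network-diagnostics"))
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	factory.Start(ctx.Done())
	if !cache.WaitForCacheSync(ctx.Done(), cmInformer.HasSynced) {
		panic("cache never synced") // the analogue of the retried MountVolume.SetUp
	}
	// After sync, lister reads are safe — the analogue of the
	// "Caches populated for *v1.ConfigMap" journal lines that follow.
	cm, err := factory.Core().V1().ConfigMaps().Lister().
		ConfigMaps("openshift-network-diagnostics").Get("kube-root-ca.crt")
	if err != nil {
		panic(err)
	}
	fmt.Println("synced:", cm.Name)
}
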
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.141375 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:35.14135921 +0000 UTC m=+29.954718779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.200360 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jstpl"] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.200779 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.203010 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rsvgk"] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.203379 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.203403 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n27lq"] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.204052 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.204432 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.206209 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.206486 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f5lgw"] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.206529 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.206870 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.208197 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.210925 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.210936 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.212331 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.212875 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.212881 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.213367 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.213868 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.214215 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.214426 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.215860 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.216227 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.222669 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-06 06:52:33 +0000 UTC, rotation deadline is 2026-10-30 12:17:35.075129198 +0000 UTC Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.222719 4954 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7877h20m0.852412593s for next certificate rotation Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.229690 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.247187 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.269013 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.285713 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.307818 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.326624 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
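
[Editor's note; this annotation is not part of the captured journal. Every "Failed to update status for pod" entry in this capture fails the same way: the API server cannot deliver the admission review to the webhook "pod.network-node-identity.openshift.io" because nothing is listening on 127.0.0.1:9743, so each status PATCH ends in "connection refused" and kubelet retries it on the next sync, which is why the same patches reappear throughout. A minimal Python sketch that reproduces the dial kubelet reports; only the address, port, and timeout are taken from the log, the script itself is illustrative:]

    import socket

    # Endpoint taken verbatim from the failing call in the journal:
    # Post "https://127.0.0.1:9743/pod?timeout=10s"
    ADDR = ("127.0.0.1", 9743)

    try:
        # Same 10-second budget as the webhook call's ?timeout=10s parameter.
        with socket.create_connection(ADDR, timeout=10):
            print("webhook endpoint accepted a TCP connection")
    except OSError as exc:
        # On the node captured above this prints the same "connection refused"
        # that terminates each failed status patch.
        print(f"webhook endpoint unreachable: {exc}")

[The webhook's own backing pod appears above as openshift-network-node-identity/network-node-identity-vrzqb, itself stuck in ContainerCreating, so the refusal is self-consistent: only the status updates are rejected, and they clear once that pod's webhook container listens again.]
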
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342266 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342328 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-etc-kubernetes\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342369 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cbrq\" (UniqueName: \"kubernetes.io/projected/7e0babbe-21ce-42f4-90cf-c3eb21991413-kube-api-access-4cbrq\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342396 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-os-release\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342438 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee79602b-4e64-45c3-a608-9d312315f206-hosts-file\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342460 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-system-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342486 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-multus\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342511 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-multus-certs\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-hostroot\") pod
\"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342648 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts786\" (UniqueName: \"kubernetes.io/projected/36967035-88f9-47a3-a15a-bce812678973-kube-api-access-ts786\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342679 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-netns\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-socket-dir-parent\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342757 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbzs\" (UniqueName: \"kubernetes.io/projected/1d174f37-f89e-4daf-a663-3cad4e33dad2-kube-api-access-ppbzs\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342781 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e0babbe-21ce-42f4-90cf-c3eb21991413-proxy-tls\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342802 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-daemon-config\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-binary-copy\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-kubelet\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342914 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/7e0babbe-21ce-42f4-90cf-c3eb21991413-rootfs\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342960 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-system-cni-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.342985 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4jk\" (UniqueName: \"kubernetes.io/projected/ee79602b-4e64-45c3-a608-9d312315f206-kube-api-access-zs4jk\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-k8s-cni-cncf-io\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343033 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-cnibin\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343110 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e0babbe-21ce-42f4-90cf-c3eb21991413-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-cnibin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc 
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343210 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-os-release\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-cni-binary-copy\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343288 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-bin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.343309 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-conf-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk"
Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.344670 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.363508 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.376775 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.377982 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.387736 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.398742 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.415422 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.426794 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.432053 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.432064 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.432369 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.440940 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.443324 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-cni-binary-copy\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-conf-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443886 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-bin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443907 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-etc-kubernetes\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443959 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbrq\" (UniqueName: 
\"kubernetes.io/projected/7e0babbe-21ce-42f4-90cf-c3eb21991413-kube-api-access-4cbrq\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443979 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-os-release\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.443996 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee79602b-4e64-45c3-a608-9d312315f206-hosts-file\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444022 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-system-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444038 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-multus\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444057 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-multus-certs\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444082 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-hostroot\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-netns\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444122 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts786\" (UniqueName: \"kubernetes.io/projected/36967035-88f9-47a3-a15a-bce812678973-kube-api-access-ts786\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbzs\" (UniqueName: \"kubernetes.io/projected/1d174f37-f89e-4daf-a663-3cad4e33dad2-kube-api-access-ppbzs\") pod 
\"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444158 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-socket-dir-parent\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e0babbe-21ce-42f4-90cf-c3eb21991413-proxy-tls\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444190 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-daemon-config\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-binary-copy\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444227 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-kubelet\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444243 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7e0babbe-21ce-42f4-90cf-c3eb21991413-rootfs\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444268 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-system-cni-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444287 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4jk\" (UniqueName: \"kubernetes.io/projected/ee79602b-4e64-45c3-a608-9d312315f206-kube-api-access-zs4jk\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-k8s-cni-cncf-io\") pod \"multus-rsvgk\" (UID: 
\"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444348 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-cnibin\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444366 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e0babbe-21ce-42f4-90cf-c3eb21991413-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444408 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-cnibin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444422 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-os-release\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444518 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-os-release\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444680 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-cni-binary-copy\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444769 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee79602b-4e64-45c3-a608-9d312315f206-hosts-file\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444774 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-socket-dir-parent\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444803 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-conf-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-etc-kubernetes\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444850 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-bin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.444926 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-system-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445017 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-cni-multus\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-multus-certs\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445091 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-os-release\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-hostroot\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " 
pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-netns\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445208 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-var-lib-kubelet\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445262 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7e0babbe-21ce-42f4-90cf-c3eb21991413-rootfs\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-system-cni-dir\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-host-run-k8s-cni-cncf-io\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-cnibin\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445594 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36967035-88f9-47a3-a15a-bce812678973-cnibin\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-cni-dir\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.445939 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d174f37-f89e-4daf-a663-3cad4e33dad2-multus-daemon-config\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.446037 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-binary-copy\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.446401 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36967035-88f9-47a3-a15a-bce812678973-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.446597 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e0babbe-21ce-42f4-90cf-c3eb21991413-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.450389 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e0babbe-21ce-42f4-90cf-c3eb21991413-proxy-tls\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.454603 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.467361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbrq\" (UniqueName: \"kubernetes.io/projected/7e0babbe-21ce-42f4-90cf-c3eb21991413-kube-api-access-4cbrq\") pod \"machine-config-daemon-f5lgw\" (UID: \"7e0babbe-21ce-42f4-90cf-c3eb21991413\") " pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.471259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbzs\" (UniqueName: \"kubernetes.io/projected/1d174f37-f89e-4daf-a663-3cad4e33dad2-kube-api-access-ppbzs\") pod \"multus-rsvgk\" (UID: \"1d174f37-f89e-4daf-a663-3cad4e33dad2\") " pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.472135 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts786\" (UniqueName: \"kubernetes.io/projected/36967035-88f9-47a3-a15a-bce812678973-kube-api-access-ts786\") pod \"multus-additional-cni-plugins-n27lq\" (UID: \"36967035-88f9-47a3-a15a-bce812678973\") " pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.475706 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.475996 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4jk\" (UniqueName: \"kubernetes.io/projected/ee79602b-4e64-45c3-a608-9d312315f206-kube-api-access-zs4jk\") pod \"node-resolver-jstpl\" (UID: \"ee79602b-4e64-45c3-a608-9d312315f206\") " pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.487693 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.488120 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-crz6w"] Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.489060 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.491197 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.491321 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.492889 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.493493 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.493782 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.496155 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.496538 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.505352 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.516451 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jstpl" Dec 06 06:57:34 crc kubenswrapper[4954]: W1206 06:57:34.528708 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee79602b_4e64_45c3_a608_9d312315f206.slice/crio-69487602858f26d58d25b93d46c0fc3fff1da5c39cd04e688d71a22523523b93 WatchSource:0}: Error finding container 69487602858f26d58d25b93d46c0fc3fff1da5c39cd04e688d71a22523523b93: Status 404 returned error can't find the container with id 69487602858f26d58d25b93d46c0fc3fff1da5c39cd04e688d71a22523523b93 Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.529027 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.533545 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rsvgk" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.542993 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545288 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n27lq" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545736 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545814 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545843 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545954 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.545983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546011 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rg6\" (UniqueName: \"kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546037 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546061 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546089 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546243 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546279 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546303 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546347 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546402 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546457 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546483 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546520 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.546545 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: W1206 06:57:34.550829 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d174f37_f89e_4daf_a663_3cad4e33dad2.slice/crio-8f5e7ee872b758896c3c6a2dcc5ea499a352b1252a5b0ba6583ea34b92638178 WatchSource:0}: Error finding container 8f5e7ee872b758896c3c6a2dcc5ea499a352b1252a5b0ba6583ea34b92638178: Status 404 returned error can't find the container with id 8f5e7ee872b758896c3c6a2dcc5ea499a352b1252a5b0ba6583ea34b92638178 Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.560849 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.561841 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.564170 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:34 crc kubenswrapper[4954]: W1206 06:57:34.573611 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36967035_88f9_47a3_a15a_bce812678973.slice/crio-b36071a97ba4e5c4528190f25b49ab85c255278bff28ff7dc4605010e19811e4 WatchSource:0}: Error finding container b36071a97ba4e5c4528190f25b49ab85c255278bff28ff7dc4605010e19811e4: Status 404 returned error can't find the container with id b36071a97ba4e5c4528190f25b49ab85c255278bff28ff7dc4605010e19811e4 Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.588891 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.607134 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.619622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerStarted","Data":"b36071a97ba4e5c4528190f25b49ab85c255278bff28ff7dc4605010e19811e4"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.621036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jstpl" event={"ID":"ee79602b-4e64-45c3-a608-9d312315f206","Type":"ContainerStarted","Data":"69487602858f26d58d25b93d46c0fc3fff1da5c39cd04e688d71a22523523b93"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.624465 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef"} Dec 06 06:57:34 crc kubenswrapper[4954]: 
I1206 06:57:34.624520 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.624541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d78c3a5f29ab9ff36dd71cc024c35eb4c4afc5ad59a57720f02993a16da84f58"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.625202 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.626908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.627298 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"94a789dda653f440d4d2facc2f17affdc1da7521236b7996ad427c87f1968e19"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.629852 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerStarted","Data":"8f5e7ee872b758896c3c6a2dcc5ea499a352b1252a5b0ba6583ea34b92638178"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.631597 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"84465a5bd1e47c0865031027e9bea02eb3ffcf6bc22b772babe09f4ce7cd4ae0"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.632720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"99991730c8a7c11b32b8c4e0b7dc6d717ac5fe9de98395263ddbce10305b8a26"} Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.633657 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 06:57:34 crc kubenswrapper[4954]: E1206 06:57:34.633917 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.641663 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647341 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647374 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647403 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647426 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647529 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647644 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647656 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647712 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647799 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647846 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647853 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647882 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647907 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rg6\" (UniqueName: \"kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647941 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647960 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.647982 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648021 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648404 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648495 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648586 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648617 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648643 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648668 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648706 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648782 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648721 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.648813 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.650831 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.653114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.661500 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.682628 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rg6\" (UniqueName: \"kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6\") pod \"ovnkube-node-crz6w\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.683104 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.697612 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.716662 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.741330 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.762879 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.787812 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.801881 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.805759 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: W1206 06:57:34.814513 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc429b1_3932_4515_a2b8_f0dd601f3e4c.slice/crio-9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412 WatchSource:0}: Error finding container 9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412: Status 404 returned error can't find the container with id 9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412 Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.819548 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.835138 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.851296 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.919871 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.940318 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.961662 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:34 crc kubenswrapper[4954]: I1206 06:57:34.988805 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:34Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.054001 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.054237 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:57:37.054209922 +0000 UTC m=+31.867569331 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.155047 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.155122 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.155157 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.155183 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155345 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155340 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155402 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155414 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155426 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:35 crc 
kubenswrapper[4954]: E1206 06:57:35.155361 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155477 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155360 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155463 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:37.155440988 +0000 UTC m=+31.968800377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155548 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:37.155527251 +0000 UTC m=+31.968886640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155559 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:37.155553741 +0000 UTC m=+31.968913130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.155585 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-06 06:57:37.155580652 +0000 UTC m=+31.968940041 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.321421 4954 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.322299 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.322377 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.322470 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.322299 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.322336 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.323077 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: W1206 06:57:35.323137 4954 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.443369 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.443469 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.443597 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.444221 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.448387 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.449417 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.451524 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.452432 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.453784 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.454436 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.455759 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.457691 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.459055 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.464235 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" 
Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.465169 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.467490 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.468796 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.470416 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.472552 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.473776 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.474191 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.474888 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.475316 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.475997 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.478190 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.478838 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.481770 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.484408 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.485464 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.486866 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.487641 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.489090 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.489779 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.491088 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 06 06:57:35 crc 
kubenswrapper[4954]: I1206 06:57:35.491879 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.492515 4954 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.494455 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.497847 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.499742 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.500409 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.501420 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.503059 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.503791 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.504827 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.505493 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.506547 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.507060 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.508181 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.508855 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.509848 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 06 06:57:35 crc 
kubenswrapper[4954]: I1206 06:57:35.510373 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.511318 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.511895 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.513079 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.513775 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.514756 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.515275 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.515626 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.516377 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.516998 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.517469 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.538217 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.553543 4954 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 
06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.561261 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.575244 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.596988 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.611527 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.626069 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.639218 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.639301 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.641033 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerStarted","Data":"fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.642656 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672" exitCode=0 Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.642735 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.642784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.644777 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976" exitCode=0 Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.644846 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.645649 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.646542 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jstpl" event={"ID":"ee79602b-4e64-45c3-a608-9d312315f206","Type":"ContainerStarted","Data":"43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.646905 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.647054 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.674618 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.692088 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.707063 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.724593 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.744194 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.767943 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.782454 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.782772 4954 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.785309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.785358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.785369 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.785523 4954 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.796229 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.797742 4954 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.798152 4954 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.799478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.799508 4954 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.799521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.799540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.799559 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.812197 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc 
kubenswrapper[4954]: E1206 06:57:35.822131 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.829820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.829879 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:35 
crc kubenswrapper[4954]: I1206 06:57:35.829898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.829920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.829935 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.831016 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.853112 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.853162 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.858375 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.858418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.858428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.858446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.858458 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.869670 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.871112 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.875095 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.875144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.875159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.875180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.875195 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.887458 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.892957 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.897634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.897693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.897705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.897725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.897740 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.915825 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z 
is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.918355 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:35 crc kubenswrapper[4954]: E1206 06:57:35.918490 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.920630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.920679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.920694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.920713 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:35 crc kubenswrapper[4954]: I1206 06:57:35.920724 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:35Z","lastTransitionTime":"2025-12-06T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.023708 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.023747 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.023755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.023775 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.023784 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.128172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.128766 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.128781 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.128801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.128815 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.231846 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.231890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.231899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.231916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.231926 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.292596 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.297586 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.335206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.335244 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.335254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.335277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.335293 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.395114 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.437553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.437641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.437657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.437678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.437691 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.443005 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:36 crc kubenswrapper[4954]: E1206 06:57:36.443876 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.539943 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.539981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.539990 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.540005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.540014 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.643549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.644071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.644085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.644105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.644118 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.652684 4954 generic.go:334] "Generic (PLEG): container finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926" exitCode=0 Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.652734 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.659678 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.659741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.659772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.672826 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.688113 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.704959 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.719705 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.730903 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.740645 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.749610 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.749664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.749678 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.749705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.749720 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.754555 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.759470 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7gzpg"] Dec 06 06:57:36 crc 
kubenswrapper[4954]: I1206 06:57:36.759880 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7gzpg"
Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.762029 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.762181 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.762319 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.762439 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.772694 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.786947 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.801534 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.815073 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.826458 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.846496 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z 
is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.852929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.852972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.852981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.852998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.853007 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.857965 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.873308 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.875209 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.877145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3d30cc7-60f8-4b93-ab83-4740d5568464-host\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.877222 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3d30cc7-60f8-4b93-ab83-4740d5568464-serviceca\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.877290 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqm57\" (UniqueName: \"kubernetes.io/projected/a3d30cc7-60f8-4b93-ab83-4740d5568464-kube-api-access-kqm57\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.891787 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.907394 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.909750 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.911891 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.922194 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.936258 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.950946 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.955886 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.955935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.955951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.955974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.955987 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:36Z","lastTransitionTime":"2025-12-06T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.968443 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.978894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3d30cc7-60f8-4b93-ab83-4740d5568464-host\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.979010 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3d30cc7-60f8-4b93-ab83-4740d5568464-serviceca\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.979059 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqm57\" (UniqueName: 
\"kubernetes.io/projected/a3d30cc7-60f8-4b93-ab83-4740d5568464-kube-api-access-kqm57\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.979123 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3d30cc7-60f8-4b93-ab83-4740d5568464-host\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.980469 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3d30cc7-60f8-4b93-ab83-4740d5568464-serviceca\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.982839 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:36Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:36 crc kubenswrapper[4954]: I1206 06:57:36.998894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqm57\" (UniqueName: \"kubernetes.io/projected/a3d30cc7-60f8-4b93-ab83-4740d5568464-kube-api-access-kqm57\") pod \"node-ca-7gzpg\" (UID: \"a3d30cc7-60f8-4b93-ab83-4740d5568464\") " pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.004754 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z 
is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.017535 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.032049 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.048428 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.058658 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.058713 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.058727 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.058747 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.058759 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.078981 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7gzpg" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.079915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.080231 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:57:41.080192606 +0000 UTC m=+35.893551995 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.161499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.161555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.161587 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.161606 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.161619 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.181386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.181429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.181458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.181481 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181602 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181668 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:41.18164964 +0000 UTC m=+35.995009029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181698 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181739 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181753 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181701 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181794 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:41.181784434 +0000 UTC m=+35.995143823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181809 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:41.181802034 +0000 UTC m=+35.995161423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181847 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181876 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181891 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.181978 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:41.181950778 +0000 UTC m=+35.995310337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.264076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.264554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.264582 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.264599 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.264610 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
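NetworkReady stays false, and the node keeps flapping to NotReady below, until a CNI configuration file appears in /etc/kubernetes/cni/net.d/ (ovnkube writes it once OVN is up). The following sketch approximates the discovery the container runtime performs via libcni; the directory is the one named in the log, and the extension list is the conventional one, stated here as an assumption rather than quoted from the runtime's source.

```go
// cniscan.go - lists candidate CNI config files the way a libcni-style
// discovery would, to show why "no CNI configuration file" is reported.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("read error:", err)
		return
	}
	found := 0
	for _, e := range entries {
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json": // assumed extension set
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file - network plugin not ready")
	}
}
```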
Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.367400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.367441 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.367452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.367468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.367487 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.443296 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.443453 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.443633 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:37 crc kubenswrapper[4954]: E1206 06:57:37.443802 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
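The "failed to patch status" entries throughout this log embed the pod-status patch as a quoted Go string inside the klog err="..." field, which is why the JSON arrives triple-escaped here. A sketch for recovering readable JSON from such a value with strconv.Unquote; the payload below is a shortened stand-in, not a full patch from the log, and depending on how many quoting layers the journal added you may need a second Unquote pass.

```go
// patchdecode.go - turns the quoted patch string from a "failed to patch
// status" entry back into indented JSON for reading.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// Shortened stand-in for the patch after the err= field's own quoting
	// has been stripped; UID copied from the log.
	raw := `"{\"metadata\":{\"uid\":\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\"},\"status\":{\"podIP\":null}}"`
	s, err := strconv.Unquote(raw)
	if err != nil {
		panic(err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(s), "", "  "); err != nil {
		panic(err)
	}
	fmt.Println(pretty.String())
}
```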
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.470808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.470870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.470882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.470899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.470908 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.573830 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.573866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.573876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.573896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.573909 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.666535 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.666638 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.666650 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.668020 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7gzpg" event={"ID":"a3d30cc7-60f8-4b93-ab83-4740d5568464","Type":"ContainerStarted","Data":"e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.668052 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7gzpg" event={"ID":"a3d30cc7-60f8-4b93-ab83-4740d5568464","Type":"ContainerStarted","Data":"b22f7cceb7503311de077b1668ae3b7bfb2ffe74a531b1e14c015c2dcde721c8"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.670591 4954 generic.go:334] "Generic (PLEG): container finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b" exitCode=0 Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.670678 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.672103 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.675955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.676004 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.676018 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.676037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.676054 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.685534 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54
319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.703482 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.719598 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.736687 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.749788 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.763218 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.776338 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.781958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.782016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.782030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.782052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.782071 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.788975 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.803018 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.825038 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.838139 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.854278 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.871248 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.884643 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.885736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.885802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.885817 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.885845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.885865 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.909494 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status:
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.955587 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.982087 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.989313 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.989373 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.989389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.989410 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:37 crc kubenswrapper[4954]: I1206 06:57:37.989426 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:37Z","lastTransitionTime":"2025-12-06T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.005474 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.024425 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.040966 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.054117 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.066289 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.084624 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z 
is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.092045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.092097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.092105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.092122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.092132 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.097086 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.109822 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.123754 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.195757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.195818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.195836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.195857 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.195870 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.298902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.298949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.298960 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.298978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.298990 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.402022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.402068 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.402089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.402112 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.402128 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.443375 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:57:38 crc kubenswrapper[4954]: E1206 06:57:38.443520 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.504908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.504969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.504980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.505002 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.505016 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.607289 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.607331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.607339 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.607357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.607368 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.678857 4954 generic.go:334] "Generic (PLEG): container finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e" exitCode=0
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.678934 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.694633 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.710592 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.710639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.710650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.710671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.710685 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.713085 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.731489 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.751605 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.773031 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.784212 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.802057 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.813023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.813061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.813074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.813096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.813110 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.818434 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.833593 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.847356 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.863539 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.885160 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.897256 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:38Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.916024 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.916079 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.916093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.916116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:38 crc kubenswrapper[4954]: I1206 06:57:38.916133 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:38Z","lastTransitionTime":"2025-12-06T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.019034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.019085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.019095 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.019110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.019122 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.122433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.122513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.122537 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.122604 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.122626 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.225409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.225460 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.225472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.225493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.225507 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.328716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.328764 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.328775 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.328795 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.328807 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.430927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.430977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.430987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.431003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.431017 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.442742 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.442804 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:39 crc kubenswrapper[4954]: E1206 06:57:39.442935 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:39 crc kubenswrapper[4954]: E1206 06:57:39.443109 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.534363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.534415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.534430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.534451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.534464 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.637527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.637598 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.637612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.637632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.637648 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.686331 4954 generic.go:334] "Generic (PLEG): container finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36" exitCode=0 Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.686440 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.692436 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.705093 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.720926 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.736090 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.740542 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.740605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.740614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.740631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.740641 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.750634 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.774693 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.787748 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.803259 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.819554 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.836552 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.842908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.842957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.842969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.842989 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.843001 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.852600 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.866868 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.879414 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.896720 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:39Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.946273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.946803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.946813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.946834 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:39 crc kubenswrapper[4954]: I1206 06:57:39.946845 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:39Z","lastTransitionTime":"2025-12-06T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.049704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.049764 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.049784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.049807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.049820 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.153614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.153669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.153687 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.153711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.153723 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.256594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.256663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.256677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.256699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.256710 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.360320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.360388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.360400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.360419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.360428 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.443041 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:40 crc kubenswrapper[4954]: E1206 06:57:40.443242 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.463319 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.463388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.463411 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.463479 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.463522 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.565750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.565789 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.565797 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.565809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.565820 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.668757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.668802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.668815 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.668838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.668850 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.703322 4954 generic.go:334] "Generic (PLEG): container finished" podID="36967035-88f9-47a3-a15a-bce812678973" containerID="7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013" exitCode=0 Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.703390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerDied","Data":"7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.716673 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.726912 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.739065 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.753066 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.766085 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.771393 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.771434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.771444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.771466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.771480 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.784759 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09
ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.798704 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.812066 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.828473 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.842996 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.858684 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.874046 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.875269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.875300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.875309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.875322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.875334 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.888509 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.978023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.978071 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.978084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.978103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:40 crc kubenswrapper[4954]: I1206 06:57:40.978115 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:40Z","lastTransitionTime":"2025-12-06T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.080405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.080446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.080456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.080471 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.080481 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.128832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.129044 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.129010805 +0000 UTC m=+43.942370194 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.183651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.183700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.183709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.183726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.183738 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.230332 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.230420 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.230452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.230489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230530 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230658 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.230632013 +0000 UTC m=+44.043991572 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230661 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230553 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230682 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230708 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230713 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.230704375 +0000 UTC m=+44.044063984 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230720 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230688 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230739 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230749 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.230741146 +0000 UTC m=+44.044100535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.230771 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.230763637 +0000 UTC m=+44.044123036 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.286915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.286991 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.287004 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.287026 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.287040 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.389061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.389130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.389145 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.389170 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.389182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.443326 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.443366 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.443532 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:41 crc kubenswrapper[4954]: E1206 06:57:41.443688 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.491513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.491557 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.491591 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.491611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.491623 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.595553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.595639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.595651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.595673 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.595686 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.698343 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.698396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.698408 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.698424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.698435 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.800915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.800952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.800961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.800977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.800986 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.904037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.904088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.904098 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.904117 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:41 crc kubenswrapper[4954]: I1206 06:57:41.904129 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:41Z","lastTransitionTime":"2025-12-06T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.007219 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.007262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.007273 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.007301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.007313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.110894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.110977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.111001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.111033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.111056 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.214073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.214577 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.214594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.214622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.214640 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.318157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.318197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.318207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.318220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.318232 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.420383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.420436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.420447 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.420470 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.420485 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.443377 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:42 crc kubenswrapper[4954]: E1206 06:57:42.443606 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.523712 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.523769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.523780 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.523800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.523824 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.626097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.626182 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.626199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.626223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.626241 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.715933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" event={"ID":"36967035-88f9-47a3-a15a-bce812678973","Type":"ContainerStarted","Data":"8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.728788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.728842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729234 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729266 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729681 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729697 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.729705 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.739043 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.755096 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.757163 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.758534 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.767105 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.784234 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.800467 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.817700 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.832003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.832061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.832075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.832098 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.832113 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.842500 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09
ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.860268 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.877236 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.894640 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.908549 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.927737 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.935640 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.935711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.935725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.935748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.935762 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:42Z","lastTransitionTime":"2025-12-06T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.942909 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.960478 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:42 crc kubenswrapper[4954]: I1206 06:57:42.976652 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.002810 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524e
d54e84b66342dec526c19efa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:42Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.017773 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.034317 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.038605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.038639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.038648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.038666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.038677 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.050281 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.065034 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.080729 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.097796 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.113547 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.127274 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.140956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.141008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.141019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.141044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.141060 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.148866 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121ef
d6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.166716 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:43Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.245317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.245382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.245395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.245418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.245436 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.348761 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.348817 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.348828 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.348849 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.348896 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.442879 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.442940 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:43 crc kubenswrapper[4954]: E1206 06:57:43.443083 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:43 crc kubenswrapper[4954]: E1206 06:57:43.443261 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.454929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.454997 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.455011 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.455033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.455048 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.558043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.558111 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.558121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.558142 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.558155 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.667955 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.668012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.668023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.668043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.668056 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.770323 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.770634 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.770715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.770792 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.770856 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.878216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.878277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.878289 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.878314 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.878335 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.981069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.981129 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.981142 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.981162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:43 crc kubenswrapper[4954]: I1206 06:57:43.981173 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:43Z","lastTransitionTime":"2025-12-06T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.083972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.084028 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.084043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.084064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.084082 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.191413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.191465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.191483 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.191508 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.191525 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.294064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.294141 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.294156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.294180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.294193 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.397181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.397261 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.397270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.397287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.397297 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.443142 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:44 crc kubenswrapper[4954]: E1206 06:57:44.443352 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.500031 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.500118 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.500162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.500181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.500191 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.602582 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.602637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.602649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.602669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.602681 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.706472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.706521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.706539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.706579 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.706593 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.809033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.809107 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.809120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.809143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.809177 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.912156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.912212 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.912223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.912243 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:44 crc kubenswrapper[4954]: I1206 06:57:44.912257 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:44Z","lastTransitionTime":"2025-12-06T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.019480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.019527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.019539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.019589 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.019603 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.122260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.122324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.122334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.122354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.122367 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.225073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.225123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.225132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.225151 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.225161 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.327413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.327456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.327468 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.327486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.327500 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.429461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.429553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.429594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.429617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.429630 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.443258 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.443349 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:45 crc kubenswrapper[4954]: E1206 06:57:45.443417 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:45 crc kubenswrapper[4954]: E1206 06:57:45.443500 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.460109 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.474693 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.487950 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.499808 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.524470 4954 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.532048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.532110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.532125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.532149 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.532165 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.539404 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 
2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.558744 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.570804 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.593223 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524e
d54e84b66342dec526c19efa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.610885 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.627426 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.634756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.634810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.634822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.634843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.634857 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.645619 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.659653 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.732514 4954 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6"] Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.733021 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.735949 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.736462 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.737631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.737680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.737693 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.737716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.737729 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.742049 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/0.log" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.745978 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa" exitCode=1 Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.746033 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.746774 4954 scope.go:117] "RemoveContainer" containerID="294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.753990 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.770168 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.785362 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.805915 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.820186 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.839541 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.840091 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.840155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.840170 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.840190 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.840203 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.853507 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.869513 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.883512 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.894225 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.894399 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.894483 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.894606 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt9n\" (UniqueName: \"kubernetes.io/projected/8a9aa90b-56ad-4dfc-be90-8030f06801d0-kube-api-access-pxt9n\") pod 
\"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.896081 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.909938 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.923257 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.937098 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.942222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.942312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.942322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.942340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.942351 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.946869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.946937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.946950 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.946971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.947016 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.956698 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c
21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: E1206 06:57:45.960533 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.967352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.967404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.967417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.967451 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.967464 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.975588 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: E1206 06:57:45.982467 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.987072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.987130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.987146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.987169 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.987183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:45Z","lastTransitionTime":"2025-12-06T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.991854 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:45Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.995608 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt9n\" (UniqueName: \"kubernetes.io/projected/8a9aa90b-56ad-4dfc-be90-8030f06801d0-kube-api-access-pxt9n\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.995658 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.995696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.995725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.996435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:45 crc kubenswrapper[4954]: I1206 06:57:45.997224 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:46 crc kubenswrapper[4954]: E1206 06:57:46.003072 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.003847 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a9aa90b-56ad-4dfc-be90-8030f06801d0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: 
\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.007903 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.008127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.008234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.008326 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.008411 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.010093 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.014017 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt9n\" (UniqueName: \"kubernetes.io/projected/8a9aa90b-56ad-4dfc-be90-8030f06801d0-kube-api-access-pxt9n\") pod \"ovnkube-control-plane-749d76644c-d7wx6\" (UID: \"8a9aa90b-56ad-4dfc-be90-8030f06801d0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:46 crc kubenswrapper[4954]: E1206 06:57:46.023994 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.025623 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.028548 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.028623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.028639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.028664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.028678 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: E1206 06:57:46.044630 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: E1206 06:57:46.044755 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046615 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046697 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046852 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.046869 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.047788 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.061674 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.076020 4954 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.092543 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.116776 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.136688 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.148085 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\
"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.151325 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.151389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.151402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.151420 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.151432 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.164167 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.177915 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.192825 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:46Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.254455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.254518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.254535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.254592 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.254612 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.358009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.358065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.358081 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.358100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.358113 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.443146 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:46 crc kubenswrapper[4954]: E1206 06:57:46.443275 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.460596 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.461023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.461128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.461222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.461324 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.564777 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.565191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.565202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.565221 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.565231 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.668528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.668617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.668636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.668656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.668671 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.771336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.771378 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.771390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.771409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.771422 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.874925 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.874969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.874980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.874999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.875010 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.978546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.978606 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.978615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.978633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:46 crc kubenswrapper[4954]: I1206 06:57:46.978645 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:46Z","lastTransitionTime":"2025-12-06T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.081937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.081985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.081996 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.082013 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.082024 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.185860 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.185902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.185914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.185941 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.185951 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.288145 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.288201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.288211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.288230 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.288240 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: W1206 06:57:47.306345 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a9aa90b_56ad_4dfc_be90_8030f06801d0.slice/crio-5bd7313728e5b3ea6b98dfa9548efdce9ff12199a8da2d7f47efcafc2d05ad55 WatchSource:0}: Error finding container 5bd7313728e5b3ea6b98dfa9548efdce9ff12199a8da2d7f47efcafc2d05ad55: Status 404 returned error can't find the container with id 5bd7313728e5b3ea6b98dfa9548efdce9ff12199a8da2d7f47efcafc2d05ad55 Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.390862 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.390907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.390920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.390940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.390954 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.442808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:47 crc kubenswrapper[4954]: E1206 06:57:47.443002 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.443077 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:47 crc kubenswrapper[4954]: E1206 06:57:47.443353 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.493908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.493952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.493961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.493976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.493986 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.597672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.597756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.597784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.597823 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.597851 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.703240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.703307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.703324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.703347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.703364 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.755322 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/0.log" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.757644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.758333 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" event={"ID":"8a9aa90b-56ad-4dfc-be90-8030f06801d0","Type":"ContainerStarted","Data":"5bd7313728e5b3ea6b98dfa9548efdce9ff12199a8da2d7f47efcafc2d05ad55"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.805752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.805782 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.805792 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.805808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.805818 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.908945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.909008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.909023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.909049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.909066 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:47Z","lastTransitionTime":"2025-12-06T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.956495 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vtxfz"] Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.957016 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:47 crc kubenswrapper[4954]: E1206 06:57:47.957080 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.974105 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:47Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.986230 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:47Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:47 crc kubenswrapper[4954]: I1206 06:57:47.997788 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:47Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012190 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012204 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.012870 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.027301 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.040772 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.061450 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.073676 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.085083 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.095958 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.110174 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.114842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.114889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.114903 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.114926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.114940 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.119681 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcqf\" (UniqueName: \"kubernetes.io/projected/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-kube-api-access-xwcqf\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.119734 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.123657 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.139460 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:3
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.157289 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc
4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.171051 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.218078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.218134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.218143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.218163 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.218173 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.220696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.220786 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcqf\" (UniqueName: \"kubernetes.io/projected/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-kube-api-access-xwcqf\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: E1206 06:57:48.220907 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:48 crc kubenswrapper[4954]: E1206 06:57:48.220987 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:48.720968414 +0000 UTC m=+43.534327803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.238265 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcqf\" (UniqueName: \"kubernetes.io/projected/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-kube-api-access-xwcqf\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.323269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.323309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.323318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.323336 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.323346 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.426380 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.426442 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.426458 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.426478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.426491 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.442399 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:48 crc kubenswrapper[4954]: E1206 06:57:48.442684 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.443742 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.530309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.530356 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.530370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.530387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.530396 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.632668 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.633016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.633033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.633050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.633060 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.726677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:48 crc kubenswrapper[4954]: E1206 06:57:48.726851 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:48 crc kubenswrapper[4954]: E1206 06:57:48.726932 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:49.726912332 +0000 UTC m=+44.540271721 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.735681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.735730 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.735742 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.735764 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.735779 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.763925 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" event={"ID":"8a9aa90b-56ad-4dfc-be90-8030f06801d0","Type":"ContainerStarted","Data":"e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.764014 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" event={"ID":"8a9aa90b-56ad-4dfc-be90-8030f06801d0","Type":"ContainerStarted","Data":"7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.764619 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.782223 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.798492 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.814796 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.828991 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.838978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.839029 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.839040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.839058 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.839392 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.849958 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121ef
d6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.862829 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.876016 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 
06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.893306 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.911730 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.923810 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.939149 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.942951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.943014 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.943030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.943054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.943067 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:48Z","lastTransitionTime":"2025-12-06T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.955837 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.969800 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.984994 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:48 crc kubenswrapper[4954]: I1206 06:57:48.997943 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:48Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.012840 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.028596 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.042842 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.045877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.045930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.045941 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.045959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.045972 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.054989 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.069783 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.083253 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.096237 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 
06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.109152 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.125639 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.129706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.129877 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:58:05.129852281 +0000 UTC m=+59.943211670 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.138602 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.148718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.148763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.148776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.148794 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.148808 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.162555 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb9178
64a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.175051 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.188923 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.206414 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.218549 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:49Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.231051 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.231110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.231185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.231218 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231316 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231354 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231375 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231382 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231333 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231415 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231387 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231425 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231403 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:05.231386146 +0000 UTC m=+60.044745535 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231543 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:05.2315242 +0000 UTC m=+60.044883589 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231555 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:05.231549011 +0000 UTC m=+60.044908400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.231592 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:05.231583992 +0000 UTC m=+60.044943381 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.251475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.252083 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.252101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.252125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.252136 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.354606 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.354656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.354666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.354683 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.354693 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.443210 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.443261 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.443260 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.443377 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.443495 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.443591 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.457140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.457193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.457205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.457226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.457238 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.559851 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.559908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.559925 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.559946 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.559965 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.663647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.663723 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.663743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.663778 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.663811 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.736070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.736218 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: E1206 06:57:49.736277 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:51.736263404 +0000 UTC m=+46.549622793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.766442 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.766482 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.766491 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.766511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.766522 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.869277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.869334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.869348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.869367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.869379 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.972922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.972981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.972990 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.973009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:49 crc kubenswrapper[4954]: I1206 06:57:49.973024 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:49Z","lastTransitionTime":"2025-12-06T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.075904 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.075985 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.075998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.076021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.076037 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.178871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.178928 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.178939 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.178959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.178974 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.281711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.281765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.281777 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.281798 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.281807 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.384547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.384636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.384647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.384666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.384677 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.443209 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:50 crc kubenswrapper[4954]: E1206 06:57:50.443398 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.488504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.488545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.488555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.488592 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.488605 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.590641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.590901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.590917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.590938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.590956 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.693527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.693617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.693630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.693651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.693666 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.774605 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.776982 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.777446 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.796358 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.796986 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.797053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.797078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.797104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.797175 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.812461 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.827362 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.845632 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.862889 
4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.876667 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.891152 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.900364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.900427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.900445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.900469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.900484 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:50Z","lastTransitionTime":"2025-12-06T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.906974 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.924799 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.941729 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 
06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.954074 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.968702 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.985132 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:50 crc kubenswrapper[4954]: I1206 06:57:50.998409 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:50Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.003151 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.003176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.003185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.003202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.003211 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.018209 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.106269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.106334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.106347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.106367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.106381 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.209409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.209485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.209505 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.209535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.209553 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.312966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.313027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.313043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.313065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.313081 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.415905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.415957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.415972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.415993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.416009 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.443206 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.443303 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.443397 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.443304 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.443501 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.443680 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.518886 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.518931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.518941 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.518958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.518966 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.621203 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.621257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.621268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.621288 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.621300 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.724406 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.724480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.724495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.724523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.724540 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.759436 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.759660 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.759787 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:57:55.759762477 +0000 UTC m=+50.573121866 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.783175 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/1.log" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.784150 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/0.log" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.787187 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1" exitCode=1 Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.787247 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.787342 4954 scope.go:117] "RemoveContainer" containerID="294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.788434 4954 scope.go:117] "RemoveContainer" containerID="c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1" Dec 06 06:57:51 crc kubenswrapper[4954]: E1206 06:57:51.788630 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.809331 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.823428 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.827251 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.827308 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.827322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.827343 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.827355 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
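The NodeNotReady flip above follows directly from the crash loop: the runtime reports NetworkReady=false until the network provider (OVN-Kubernetes here, whose ovnkube-controller is failing above) installs a CNI config under the directory named in the message. The readiness gate reduces to a directory check, sketched here with the path quoted in the log:

    # Sketch: the runtime's complaint reduces to "no CNI conf file yet".
    from pathlib import Path

    net_d = Path("/etc/kubernetes/cni/net.d")
    confs = sorted(p.name for p in net_d.glob("*.conf*")) if net_d.is_dir() else []
    print("CNI configs:", confs or "none -> NetworkReady=false, node stays NotReady")
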
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.837900 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.853232 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.869226 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.894720 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb9178
64a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.909204 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.928611 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.930800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.930868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.930885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.930912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.930929 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:51Z","lastTransitionTime":"2025-12-06T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.946120 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.961438 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.976345 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:51 crc kubenswrapper[4954]: I1206 06:57:51.994676 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:51Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.014402 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:52Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.032865 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
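Every "Failed to update status for pod" entry in this section ends the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate whose NotAfter (2025-08-24T17:21:41Z) predates the current time. A diagnostic sketch in Go that dials the endpoint from the log and prints the leaf certificate's validity window; InsecureSkipVerify is deliberate so the expired chain can be inspected instead of rejected during the handshake. This probe is a hand-rolled helper, not part of kubelet or any OpenShift tool.

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Address taken from the webhook error in the log above.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743",
            &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial:", err)
            return
        }
        defer conn.Close()

        state := conn.ConnectionState()
        if len(state.PeerCertificates) == 0 {
            fmt.Println("no peer certificates presented")
            return
        }
        leaf := state.PeerCertificates[0]
        now := time.Now()
        // Mirrors the x509 check that fails in the log: now must fall
        // inside [NotBefore, NotAfter].
        fmt.Printf("NotBefore=%s NotAfter=%s now=%s expired=%v\n",
            leaf.NotBefore.Format(time.RFC3339),
            leaf.NotAfter.Format(time.RFC3339),
            now.Format(time.RFC3339),
            now.After(leaf.NotAfter))
    }
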
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:52Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.033452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.033490 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.033503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.033523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.033536 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.049012 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:52Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.135894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 
06:57:52.135949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.135963 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.135984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.135996 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.239266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.239358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.239382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.239412 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.239434 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.342739 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.342830 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.342850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.342880 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.342900 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.442366 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:52 crc kubenswrapper[4954]: E1206 06:57:52.442548 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.445467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.445519 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.445532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.445552 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.445585 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.548279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.548351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.548362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.548386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.548406 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.650885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.650935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.650947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.650967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.650980 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.754136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.754202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.754213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.754236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.754250 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.792278 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/1.log" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.856624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.856666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.856683 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.856701 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.856716 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.960049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.960143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.960157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.960175 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:52 crc kubenswrapper[4954]: I1206 06:57:52.960187 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:52Z","lastTransitionTime":"2025-12-06T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.063369 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.063403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.063413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.063428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.063437 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.165761 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.165799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.165812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.165833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.165845 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.269525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.269605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.269616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.269635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.269647 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.372798 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.372877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.372901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.372936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.372965 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.442717 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.442718 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:53 crc kubenswrapper[4954]: E1206 06:57:53.442896 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:53 crc kubenswrapper[4954]: E1206 06:57:53.443063 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.443657 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:53 crc kubenswrapper[4954]: E1206 06:57:53.443875 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
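The recurring KubeletNotReady condition points at one concrete fact: there is no CNI configuration file in /etc/kubernetes/cni/net.d/. A simplified standalone check in Go that mirrors that presence test on the path from the log. The real ocicni watcher also parses and validates the files; this sketch only looks for entries with the extensions that are assumed here to count (.conf, .conflist, .json).

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // CNI network config directory named in the NetworkReady error.
        const confDir = "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Println("read dir:", err)
            return
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config:", filepath.Join(confDir, e.Name()))
                found++
            }
        }
        if found == 0 {
            // Same condition the kubelet reports as NetworkPluginNotReady.
            fmt.Println("no CNI configuration file in", confDir)
        }
    }
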
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.476827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.476891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.476913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.476944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.476966 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.580864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.580944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.580963 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.581005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.581026 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.683772 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.683865 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.683885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.683913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.683932 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.786530 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.786890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.787000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.787124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.787234 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.890194 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.890252 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.890265 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.890286 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.890300 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.992969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.993032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.993053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.993077 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:53 crc kubenswrapper[4954]: I1206 06:57:53.993093 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:53Z","lastTransitionTime":"2025-12-06T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.095812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.095860 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.095875 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.095895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.095911 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.129950 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.139873 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.148269 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.168423 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.185635 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.197872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.197905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.197915 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.197938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.197951 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.202310 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.217555 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
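The status payloads in these "failed to patch status" entries are JSON wrapped in two layers of quoting (once by the kubelet's structured logging, once by the journal line), which is why every quote surfaces as \\\". A small Go helper that peels quoting layers with strconv.Unquote until the payload parses as JSON; the sample fragment is a shortened, hand-cut piece of the patch above, not a complete entry.

    package main

    import (
        "encoding/json"
        "fmt"
        "strconv"
    )

    func main() {
        // Shortened fragment copied from the log, still double-escaped.
        escaped := `{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"}}`
        s := escaped
        // Unquote one escaping layer at a time until the bytes are valid JSON.
        for !json.Valid([]byte(s)) {
            u, err := strconv.Unquote(`"` + s + `"`)
            if err != nil {
                fmt.Println("unquote:", err)
                return
            }
            s = u
        }
        var patch map[string]any
        if err := json.Unmarshal([]byte(s), &patch); err != nil {
            fmt.Println("unmarshal:", err)
            return
        }
        fmt.Printf("decoded patch: %+v\n", patch)
    }
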
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.231356 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.251135 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.266275 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.279862 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 
06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.300774 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z"
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.301227 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.301300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.301322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.301351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.301369 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
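Separately from the webhook failures, the node is NotReady because nothing has written a CNI config into /etc/kubernetes/cni/net.d/ yet; the ovnkube-controller container that would normally produce one is crash-looping in the next entry. A sketch of the check the NetworkPluginNotReady message boils down to; the accepted extensions (.conf, .conflist, .json) are an assumption about the runtime's config loader.

```go
// cnicheck.go: look for CNI configs in the directory named by the
// NetworkPluginNotReady message above. Sketch only.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// Mirrors the condition above: the node stays NotReady until
		// the network plugin drops a config file here.
		fmt.Println("no CNI configuration file in", dir)
	}
}
```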
Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.320578 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.334645 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.348969 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.367074 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.383761 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:54Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.405005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.405400 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.405550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.405784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.405932 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.443152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:54 crc kubenswrapper[4954]: E1206 06:57:54.443553 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.509427 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.509474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.509483 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.509499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.509510 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.612600 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.612649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.612659 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.612679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.612691 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.716414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.716467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.716480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.716500 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.716510 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.819778 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.819839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.819859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.819883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.819897 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.922620 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.922702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.922716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.922740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:54 crc kubenswrapper[4954]: I1206 06:57:54.922754 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:54Z","lastTransitionTime":"2025-12-06T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.027199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.027249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.027260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.027281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.027293 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.129735 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.129784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.129795 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.129813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.129828 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.232524 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.232609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.232622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.232642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.232664 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.335379 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.335465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.335485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.335513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.335532 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.438216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.438271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.438282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.438300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.438312 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.442752 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.442816 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.442748 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:55 crc kubenswrapper[4954]: E1206 06:57:55.442961 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:55 crc kubenswrapper[4954]: E1206 06:57:55.443164 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:55 crc kubenswrapper[4954]: E1206 06:57:55.443315 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.465964 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.483167 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.496413 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.518379 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb9178
64a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.532438 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.542028 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.542080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.542093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.542113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.542127 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.552478 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.569509 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.582163 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc 
kubenswrapper[4954]: I1206 06:57:55.595366 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.615217 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.630361 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.645238 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.645945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.645992 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.646006 4954 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.646030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.646049 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.656094 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.672846 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.689070 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.702631 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:55Z is after 2025-08-24T17:21:41Z" Dec 06 
06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.749049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.749092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.749105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.749125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.749136 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.808989 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:55 crc kubenswrapper[4954]: E1206 06:57:55.809225 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:55 crc kubenswrapper[4954]: E1206 06:57:55.809352 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:03.809315976 +0000 UTC m=+58.622675405 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.852960 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.853262 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.853340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.853368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.853391 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.956723 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.956783 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.956793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.956808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:55 crc kubenswrapper[4954]: I1206 06:57:55.956823 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:55Z","lastTransitionTime":"2025-12-06T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.059916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.059981 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.059992 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.060011 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.060022 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.162896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.162938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.162951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.162970 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.162984 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.265143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.265193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.265206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.265224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.265237 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.368417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.368493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.368511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.368540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.368594 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.371521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.371603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.371617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.371637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.371650 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.384829 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:56Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.389206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.389245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.389255 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.389272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.389286 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.401987 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:56Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.406431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.406490 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.406501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.406522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.406537 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.418652 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:56Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.424936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.425485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
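[editor's note] Every status patch in this stretch fails identically: the API server forwards the kubelet's PATCH to the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, months before the current time of 2025-12-06, so TLS verification fails before the patch is ever evaluated. The following minimal Go sketch reproduces the check the kubelet is failing and can confirm the expiry when run on the node itself (the endpoint address comes from the log; the program is illustrative, not part of any cluster tooling):

    // certcheck.go: print the validity window of the webhook serving cert.
    // Diagnostic sketch only; run on the node, since 127.0.0.1:9743 is the
    // node-local network-node-identity webhook endpoint seen in the log.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // InsecureSkipVerify lets us fetch the certificate even though
        // verification would fail; we only want to inspect it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        now := time.Now().UTC()
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
        if now.After(cert.NotAfter) {
            // Same comparison the TLS stack reports in the log lines above.
            fmt.Println("EXPIRED: current time", now.Format(time.RFC3339), "is after notAfter")
        }
    }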
event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.425651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.425760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.425842 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.439498 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:57:56Z is after 2025-08-24T17:21:41Z" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.442465 4954 util.go:30] "No sandbox for pod can be found. 
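[editor's note] Independently of the webhook failure, the Ready condition itself is False because the container runtime network is not ready: the kubelet finds no CNI network configuration in /etc/kubernetes/cni/net.d/, so pod sandboxes (network-check-source above, and network-check-target and network-metrics-daemon further down) cannot be started. A small Go sketch of that directory check follows, assuming the standard libcni file extensions (.conf, .conflist, .json); the filename filter is an assumption, not lifted from the kubelet:

    // cnicheck.go: report whether any CNI network config exists in the
    // directory named by the kubelet error. Diagnostic sketch only.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        var confs []string
        for _, e := range entries {
            switch strings.ToLower(filepath.Ext(e.Name())) {
            case ".conf", ".conflist", ".json": // extensions libcni loads
                confs = append(confs, e.Name())
            }
        }
        if len(confs) == 0 {
            fmt.Println("no CNI configuration file found; matches the kubelet error above")
            return
        }
        fmt.Println("CNI configs present:", confs)
    }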
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.442838 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.445669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.445728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.445744 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.445768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.445781 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.459530 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
Dec 06 06:57:56 crc kubenswrapper[4954]: E1206 06:57:56.459684 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.470529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.470576 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.470588 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.470609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.470621 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} [identical NodeHasSufficientMemory/NodeHasNoDiskPressure/NodeHasSufficientPID/NodeNotReady event groups and "Node became not ready" conditions repeated at roughly 100ms intervals from 06:57:56.573281 through 06:57:57.401132; elided]
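[editor's note] The "exceeds retry count" line above closes the loop that began with the first failure: the kubelet attempts the status patch a fixed number of times (five attempts in all in this capture, the first entering the section mid-payload, matching the kubelet's nodeStatusUpdateRetry constant of 5 at the time of writing), then gives up and starts over on the next sync interval. A Go schematic of that control flow; the function names here are illustrative, not the kubelet's actual identifiers:

    // retrysketch.go: bounded retry shaped like the log sequence above.
    package main

    import (
        "errors"
        "fmt"
    )

    // The kubelet caps status-update retries with a small constant (5 in
    // current sources); the value here mirrors the five failures logged.
    const nodeStatusUpdateRetry = 5

    func tryUpdateNodeStatus() error {
        // Stand-in for the real PATCH; in this incident every attempt fails
        // the same way, on the webhook's expired serving certificate.
        return errors.New("failed calling webhook: x509: certificate has expired")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(); err != nil {
                fmt.Println("Error updating node status, will retry:", err)
                continue
            }
            return nil
        }
        return fmt.Errorf("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            // Matches: "Unable to update node status" in the log above.
            fmt.Println("Unable to update node status:", err)
        }
    }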
Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.676443 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.676501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.676515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.676533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.676542 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.779130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.779498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.779745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.779926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.780126 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.883796 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.884181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.884366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.884552 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.884814 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.987625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.987696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.987715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.987738 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:56 crc kubenswrapper[4954]: I1206 06:57:56.987752 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:56Z","lastTransitionTime":"2025-12-06T06:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.090466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.090507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.090519 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.090538 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.090550 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.193837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.193883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.193893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.193912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.193923 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.297228 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.297280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.297292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.297311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.297323 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.401019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.401083 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.401099 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.401119 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.401132 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.443385 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.443531 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:57 crc kubenswrapper[4954]: E1206 06:57:57.443556 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.443770 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:57 crc kubenswrapper[4954]: E1206 06:57:57.444025 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:57 crc kubenswrapper[4954]: E1206 06:57:57.444391 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.504236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.504283 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.504293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.504312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.504326 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.607202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.607256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.607266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.607284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.607295 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.710927 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.711006 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.711024 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.711051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.711065 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.814430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.814494 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.814503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.814522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.814533 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.917419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.917484 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.917501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.917522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:57 crc kubenswrapper[4954]: I1206 06:57:57.917536 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:57Z","lastTransitionTime":"2025-12-06T06:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.020256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.020318 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.020363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.020387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.020401 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.123814 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.123889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.123904 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.123926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.123940 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.227066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.227126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.227140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.227160 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.227173 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.330272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.330324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.330332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.330349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.330360 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.433641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.433682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.433691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.433709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.433718 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.442372 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:57:58 crc kubenswrapper[4954]: E1206 06:57:58.442608 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.536518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.536577 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.536588 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.536604 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.536614 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.641062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.641121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.641136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.641166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.641182 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.745661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.746069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.746216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.746364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.746492 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.849424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.849505 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.849515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.849532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.849541 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.952332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.952697 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.952716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.952741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:58 crc kubenswrapper[4954]: I1206 06:57:58.952777 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:58Z","lastTransitionTime":"2025-12-06T06:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.056056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.056128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.056143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.056169 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.056183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.158929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.158994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.159007 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.159027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.159041 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.262303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.262357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.262371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.262393 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.262407 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.365826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.365882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.365894 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.365917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.365930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.442699 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.442756 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.442768 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:57:59 crc kubenswrapper[4954]: E1206 06:57:59.442918 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:57:59 crc kubenswrapper[4954]: E1206 06:57:59.443019 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:57:59 crc kubenswrapper[4954]: E1206 06:57:59.443146 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.469229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.469294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.469312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.469338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.469359 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.572769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.572843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.572861 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.572887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.572901 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.675625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.675686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.675705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.675726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.675737 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.779061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.779122 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.779139 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.779162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.779174 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.882006 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.882062 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.882075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.882096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.882111 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.985397 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.985477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.985501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.985533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:57:59 crc kubenswrapper[4954]: I1206 06:57:59.985596 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:57:59Z","lastTransitionTime":"2025-12-06T06:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.088975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.089051 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.089077 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.089101 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.089116 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.192169 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.192222 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.192236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.192259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.192274 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.296237 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.296329 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.296346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.296826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.296883 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.400307 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.400391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.400403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.400429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.400448 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.443313 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:00 crc kubenswrapper[4954]: E1206 06:58:00.443612 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.503924 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.503984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.503995 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.504019 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.504040 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.607304 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.607351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.607364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.607383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.607395 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.711115 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.711180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.711192 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.711215 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.711234 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.814838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.814946 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.814979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.815020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.815086 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.919597 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.919654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.919666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.919691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:00 crc kubenswrapper[4954]: I1206 06:58:00.919704 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:00Z","lastTransitionTime":"2025-12-06T06:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.022893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.022939 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.022949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.022969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.022979 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.126601 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.126652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.126666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.126686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.126700 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.230966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.231032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.231044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.231073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.231091 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.334633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.334715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.334740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.334773 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.334798 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.439368 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.439438 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.439467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.439501 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.439526 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.442826 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.442829 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:01 crc kubenswrapper[4954]: E1206 06:58:01.442992 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.442922 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:01 crc kubenswrapper[4954]: E1206 06:58:01.443211 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:01 crc kubenswrapper[4954]: E1206 06:58:01.443339 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.542960 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.543015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.543027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.543045 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.543060 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.647057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.647130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.647166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.647199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.647223 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.751069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.751140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.751156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.751179 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.751196 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.854906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.855350 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.855497 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.855813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.856105 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.959363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.959409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.959418 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.959438 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:01 crc kubenswrapper[4954]: I1206 06:58:01.959450 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:01Z","lastTransitionTime":"2025-12-06T06:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.062434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.062835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.063193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.063284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.063390 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.167425 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.167977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.168147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.168355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.168518 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.271870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.271937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.271959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.271986 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.272004 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.375959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.376040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.376061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.376094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.376117 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.443140 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:02 crc kubenswrapper[4954]: E1206 06:58:02.443329 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.479257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.479325 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.479343 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.479377 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.479397 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.582691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.582750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.582767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.582787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.582797 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.686472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.686525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.686540 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.686557 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.686601 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.789158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.789228 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.789242 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.789271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.789286 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.894017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.894534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.894569 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.894605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.894621 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.997541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.997625 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.997641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.997666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:02 crc kubenswrapper[4954]: I1206 06:58:02.997681 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:02Z","lastTransitionTime":"2025-12-06T06:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.101761 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.101835 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.101854 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.101885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.101906 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.205085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.205158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.205169 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.205217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.205231 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.308317 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.308383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.308396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.308416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.308429 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.410845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.410884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.410892 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.410909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.410918 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.442808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.442808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.442964 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:03 crc kubenswrapper[4954]: E1206 06:58:03.443133 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:03 crc kubenswrapper[4954]: E1206 06:58:03.443229 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:03 crc kubenswrapper[4954]: E1206 06:58:03.443314 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.513290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.513364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.513378 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.513397 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.513408 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.617041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.617094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.617106 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.617126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.617138 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.720486 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.720527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.720538 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.720578 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.720590 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.824000 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.824065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.824084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.824110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.824127 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.907720 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:03 crc kubenswrapper[4954]: E1206 06:58:03.907847 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 06 06:58:03 crc kubenswrapper[4954]: E1206 06:58:03.907916 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:19.907899108 +0000 UTC m=+74.721258497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.925872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.925936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.925950 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.925967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:03 crc kubenswrapper[4954]: I1206 06:58:03.926001 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:03Z","lastTransitionTime":"2025-12-06T06:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.028703 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.028757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.028768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.028787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.028799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.131841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.131917 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.131942 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.131972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.131990 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.235609 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.235692 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.235718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.235756 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.235784 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.339590 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.339628 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.339639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.339662 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.339677 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.442728 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:04 crc kubenswrapper[4954]: E1206 06:58:04.442906 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.442987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.443022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.443039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.443057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.443074 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.546272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.546328 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.546363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.546388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.546402 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.569117 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.588825 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.603678 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.618748 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.635248 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.651081 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.651197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.651267 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.651309 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.651345 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.652463 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.668343 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.683836 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.696348 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.714980 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.731059 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.743513 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.756361 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.756475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.757152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.757199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.757220 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.757586 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.770824 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.785781 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.807161 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb9178
64a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.821287 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:04Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.860596 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.860658 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.860671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.860691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.860703 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.963920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.964305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.964380 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.964455 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:04 crc kubenswrapper[4954]: I1206 06:58:04.964517 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:04Z","lastTransitionTime":"2025-12-06T06:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.067633 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.067682 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.067694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.067711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.067722 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.170404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.170444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.170453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.170469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.170479 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.223505 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.223755 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:58:37.223719511 +0000 UTC m=+92.037078920 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.275125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.275920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.275937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.275964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.275978 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.325425 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.325518 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.325631 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.325695 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325824 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325854 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325890 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325952 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325910 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325989 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.326021 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.326031 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.325929 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:37.325904625 +0000 UTC m=+92.139264024 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.326120 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:37.32610011 +0000 UTC m=+92.139459539 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.326158 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:37.326143062 +0000 UTC m=+92.139502501 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.326202 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:37.326188123 +0000 UTC m=+92.139547552 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.378952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.379074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.379100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.379133 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.379159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.442813 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.442927 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.442883 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.443192 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.443741 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:05 crc kubenswrapper[4954]: E1206 06:58:05.444542 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.446344 4954 scope.go:117] "RemoveContainer" containerID="c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.461715 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.479652 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.482337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.482376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.482390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.482413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.482428 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.496044 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.521049 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294cd80dd19fefe1aabbc6827a3fb371ac8b524ed54e84b66342dec526c19efa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:44Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.247463 6266 factory.go:656] Stopping watch factory\\\\nI1206 06:57:44.247588 6266 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247631 6266 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.247779 6266 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1206 06:57:44.247941 6266 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 06:57:44.248220 6266 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248425 6266 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:57:44.248544 6266 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.534899 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.551692 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.567993 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.578840 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.587372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.587405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.587415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.587430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.587441 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.592783 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.608536 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a8
0822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.625345 4954 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.638247 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.652639 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.672995 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.691387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.691423 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc 
kubenswrapper[4954]: I1206 06:58:05.691434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.691452 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.691466 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.694421 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.708844 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.723190 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.740094 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.770459 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.784152 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.794308 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.794357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.794371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.794390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.794402 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.798537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.819655 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.834957 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.852793 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.868945 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.881077 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.901300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.902154 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.902184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.902211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.902225 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:05Z","lastTransitionTime":"2025-12-06T06:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.902754 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.923304 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.940461 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.955587 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.967115 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:05 crc kubenswrapper[4954]: I1206 06:58:05.981256 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:05Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.004702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.004754 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc 
kubenswrapper[4954]: I1206 06:58:06.004767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.004788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.004799 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.109148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.109216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.109226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.109250 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.109265 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.212476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.212534 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.212547 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.212585 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.212604 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.315469 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.315519 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.315531 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.315550 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.315586 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.418432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.418922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.418949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.418983 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.419011 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.442334 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.442597 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.523231 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.523867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.523907 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.523937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.523952 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.598044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.598109 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.598123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.598146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.598160 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.613718 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.620178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.620248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.620260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.620279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.620295 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.636587 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.642207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.642267 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.642279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.642301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.642313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.662993 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.673696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.673753 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.673768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.673791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.673806 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.689217 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.706212 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.706263 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.706275 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.706293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.706305 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.731964 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: E1206 06:58:06.732084 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.734052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.734092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.734103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.734121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.734131 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.837518 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.837602 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.837618 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.837642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.837661 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.850288 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/1.log" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.853835 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.854301 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.870337 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.884991 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.896849 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.917021 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.934650 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.940049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.940095 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.940108 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.940129 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.940143 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:06Z","lastTransitionTime":"2025-12-06T06:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.955807 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.971208 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:06 crc kubenswrapper[4954]: I1206 06:58:06.987077 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:06Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.012754 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27
f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.026653 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.042812 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.043449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.043498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.043511 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.043533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.043543 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.058036 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.072715 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.087706 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.106073 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.120081 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:07Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:07 crc 
kubenswrapper[4954]: I1206 06:58:07.146093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.146155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.146166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.146185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.146198 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.248345 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.248391 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.248401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.248417 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.248431 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.351705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.351750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.351764 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.351784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.351796 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.443223 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.443361 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:07 crc kubenswrapper[4954]: E1206 06:58:07.443468 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:07 crc kubenswrapper[4954]: E1206 06:58:07.443656 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.443894 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:07 crc kubenswrapper[4954]: E1206 06:58:07.444128 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.454223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.454280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.454292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.454313 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.454329 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.557529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.557644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.557665 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.557698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.557724 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.666113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.666214 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.666243 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.666274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.666316 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.770021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.770066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.770080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.770099 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.770114 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.873082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.873138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.873150 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.873172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.873191 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.976008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.976044 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.976053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.976069 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:07 crc kubenswrapper[4954]: I1206 06:58:07.976079 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:07Z","lastTransitionTime":"2025-12-06T06:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.080298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.080373 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.080395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.080429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.080453 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.184146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.184193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.184208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.184231 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.184246 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.287017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.287061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.287073 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.287092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.287106 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.389836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.389884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.389896 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.389914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.389928 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.442896 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:58:08 crc kubenswrapper[4954]: E1206 06:58:08.443067 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.493002 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.493040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.493048 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.493063 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.493074 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.596201 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.596254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.596269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.596295 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.596313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.700514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.700627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.700646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.700677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.700699 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.804612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.804667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.804687 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.804719 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.804748 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.864525 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/2.log"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.865706 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/1.log"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.869777 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c" exitCode=1
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.869864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c"}
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.869925 4954 scope.go:117] "RemoveContainer" containerID="c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.870876 4954 scope.go:117] "RemoveContainer" containerID="e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c"
Dec 06 06:58:08 crc kubenswrapper[4954]: E1206 06:58:08.871114 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.891537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.907709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.907762 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.907776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.907799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.907816 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:08Z","lastTransitionTime":"2025-12-06T06:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.911991 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.929318 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.951517 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.970236 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:08 crc kubenswrapper[4954]: I1206 06:58:08.986122 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.001302 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:08Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.010320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.010358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.010374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.010395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.010408 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.019351 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.033704 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.049696 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.063303 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.075715 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.085261 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.101929 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.112841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.112887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc 
kubenswrapper[4954]: I1206 06:58:09.112901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.112924 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.112939 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.114080 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.135770 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:09Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.215950 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.216043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.216057 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.216080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.216093 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.320339 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.320706 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.321246 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.321344 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.321481 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.424088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.424137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.424151 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.424172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.424185 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.443413 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.443426 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.443454 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:09 crc kubenswrapper[4954]: E1206 06:58:09.443529 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:09 crc kubenswrapper[4954]: E1206 06:58:09.443699 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:09 crc kubenswrapper[4954]: E1206 06:58:09.443812 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.527324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.527419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.527446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.527476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.527496 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.630349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.630958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.631075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.631224 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.631287 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.734441 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.734513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.734527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.734549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.734578 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.837859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.838257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.838375 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.838487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.838624 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.874387 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/2.log" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.941327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.941376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.941390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.941413 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:09 crc kubenswrapper[4954]: I1206 06:58:09.941424 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:09Z","lastTransitionTime":"2025-12-06T06:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.044749 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.044903 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.044916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.044938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.044950 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:10Z","lastTransitionTime":"2025-12-06T06:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.148008 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.148054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.148063 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.148080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.148090 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:10Z","lastTransitionTime":"2025-12-06T06:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.251158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.251240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.251258 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.251284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.251303 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:10Z","lastTransitionTime":"2025-12-06T06:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.354711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.354766 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.354793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.354814 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.354827 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:10Z","lastTransitionTime":"2025-12-06T06:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.442795 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:58:10 crc kubenswrapper[4954]: E1206 06:58:10.442975 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.457864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.457918 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.457935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.457961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:10 crc kubenswrapper[4954]: I1206 06:58:10.457974 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:10Z","lastTransitionTime":"2025-12-06T06:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[identical five-record NodeNotReady status cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeated roughly every 100 ms at 06:58:10.560, .663, .765, .868, .971 and 06:58:11.074, .178, .283, .387; only the timestamps differ]
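Every pod-sync failure in this stretch reduces to the same condition: the network plugin stays NotReady because /etc/kubernetes/cni/net.d/ contains no CNI configuration. A minimal stand-alone sketch of that check, assuming only the directory path quoted in the log (an illustration, not the kubelet's or ocicni's actual code):

    import json
    import os

    # Directory quoted in every error record above; the rest of this
    # sketch is an illustrative assumption, not the real readiness code.
    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"

    def cni_config_present(conf_dir: str = CNI_CONF_DIR) -> bool:
        """Return True if the dir holds at least one parseable CNI config."""
        try:
            names = sorted(os.listdir(conf_dir))
        except FileNotFoundError:
            return False
        for name in names:
            if not name.endswith((".conf", ".conflist", ".json")):
                continue
            try:
                with open(os.path.join(conf_dir, name)) as f:
                    json.load(f)  # a config must at least be valid JSON
                return True
            except (OSError, ValueError):
                continue
        return False

    if __name__ == "__main__":
        print("CNI config present:", cni_config_present())

Until something (here, the cluster's network operator) drops a config file into that directory, the kubelet keeps reporting NetworkReady=false and refuses to sandbox any pod that needs pod networking.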
Dec 06 06:58:11 crc kubenswrapper[4954]: I1206 06:58:11.443011 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:11 crc kubenswrapper[4954]: I1206 06:58:11.443124 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:11 crc kubenswrapper[4954]: I1206 06:58:11.443016 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:11 crc kubenswrapper[4954]: E1206 06:58:11.443161 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:11 crc kubenswrapper[4954]: E1206 06:58:11.443463 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:11 crc kubenswrapper[4954]: E1206 06:58:11.443724 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[identical NodeNotReady status cycle repeated roughly every 100 ms at 06:58:11.490, .593, .696, .799, .902 and 06:58:12.005, .108, .211, .314, .418]
Dec 06 06:58:12 crc kubenswrapper[4954]: I1206 06:58:12.442767 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:58:12 crc kubenswrapper[4954]: E1206 06:58:12.442945 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[identical NodeNotReady status cycle repeated roughly every 100 ms at 06:58:12.520, .623, .726, .829, .932 and 06:58:13.035, .139, .242, .345]
Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.443425 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.443489 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.443577 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:13 crc kubenswrapper[4954]: E1206 06:58:13.443661 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:13 crc kubenswrapper[4954]: E1206 06:58:13.443756 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:13 crc kubenswrapper[4954]: E1206 06:58:13.443960 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
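The status-cycle records dominate this journal; when scanning a stretch like this it helps to collapse repeats by message and count them. A small sketch that tallies klog-style records from stdin, assuming only the line shape visible above (prefix like `I1206 06:58:10.457864 4954 kubelet_node_status.go:724]` followed by the quoted message):

    import re
    import sys
    from collections import Counter

    # Matches the klog record shape seen in this journal, e.g.
    #   I1206 06:58:10.457864 4954 kubelet_node_status.go:724] "Recording ..."
    # The field layout is inferred from this log alone.
    KLOG = re.compile(r'[IEW]\d{4} \d{2}:\d{2}:\d{2}\.\d+ \d+ \S+\] (.*)')
    RFC3339 = re.compile(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z')

    def summarize(lines, top=10):
        counts = Counter()
        for line in lines:
            m = KLOG.search(line)
            if m:
                # Blank out embedded timestamps so the repeated
                # "Node became not ready" records collapse into one bucket.
                counts[RFC3339.sub('<ts>', m.group(1))] += 1
        for msg, n in counts.most_common(top):
            print(f"{n:5d}x {msg[:120]}")

    if __name__ == "__main__":
        summarize(sys.stdin)

Fed this section of the journal, the top buckets are the five status-cycle messages, with the per-pod sync errors appearing once per pod per retry.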
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.448207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.448240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.448259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.448280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.448294 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.551426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.551473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.551489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.551508 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.551521 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.654520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.654580 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.654592 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.654608 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.654620 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.757466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.757509 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.757517 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.757532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.757547 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.860844 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.860921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.860945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.860980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.861002 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.964395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.964453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.964464 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.964485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:13 crc kubenswrapper[4954]: I1206 06:58:13.964498 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:13Z","lastTransitionTime":"2025-12-06T06:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.066947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.066984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.066996 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.067016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.067030 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.169506 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.169538 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.169546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.169577 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.169587 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.272642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.273216 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.273437 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.273631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.273863 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.376869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.376959 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.376979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.377001 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.377014 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.443245 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:14 crc kubenswrapper[4954]: E1206 06:58:14.443416 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.479963 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.480311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.480421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.480520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.480627 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.583578 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.583616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.583630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.583650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.583660 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.686803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.686884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.686899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.686926 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.686943 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.789403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.789457 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.789477 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.789504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.789518 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.893120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.893192 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.893208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.893232 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.893247 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.996619 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.996668 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.996680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.996698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:14 crc kubenswrapper[4954]: I1206 06:58:14.996710 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:14Z","lastTransitionTime":"2025-12-06T06:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.099507 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.099614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.099629 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.099651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.099665 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.202613 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.202663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.202677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.202698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.202711 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.305211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.305302 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.305324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.305354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.305370 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.408554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.408619 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.408630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.408649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.408666 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.442699 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.443137 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:15 crc kubenswrapper[4954]: E1206 06:58:15.443183 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.443247 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:15 crc kubenswrapper[4954]: E1206 06:58:15.443430 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:15 crc kubenswrapper[4954]: E1206 06:58:15.443640 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.459089 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.475275 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.489199 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.510487 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27
f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch 
factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.512555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.512611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.512621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.512638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.512665 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.521908 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.536666 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.551441 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.564306 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc 
kubenswrapper[4954]: I1206 06:58:15.582535 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.596855 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c6
5d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.611225 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.616739 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.616790 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.616800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.616822 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.616833 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.629082 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.649429 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.663306 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.679375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.693649 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:15Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.720075 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.720129 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.720138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.720157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.720169 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.823614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.823670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.823680 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.823700 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.823714 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.926832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.926886 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.926936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.926958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:15 crc kubenswrapper[4954]: I1206 06:58:15.926972 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:15Z","lastTransitionTime":"2025-12-06T06:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.029630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.029670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.029681 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.029699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.029712 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.133301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.133366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.133382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.133404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.133420 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.236184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.236237 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.236248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.236266 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.236278 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.339819 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.339887 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.339898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.339914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.339926 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442100 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442128 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.442288 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.442485 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.545514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.545595 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.545611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.545629 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.545641 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.648666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.648711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.648721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.648740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.648751 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.751364 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.751410 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.751421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.751442 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.751477 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.824734 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.824837 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.824861 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.824893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.824918 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.840553 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:16Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.846816 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.846872 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.846883 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.846905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.846917 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.864308 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:16Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.871650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.872068 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.872248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.872429 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.872652 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.889669 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:16Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.895287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.895333 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.895348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.895372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.895401 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.908307 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:16Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.912711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.912763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.912774 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.912795 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.912809 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.926865 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:16Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:16 crc kubenswrapper[4954]: E1206 06:58:16.927008 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.929130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.929158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.929172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.929191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:16 crc kubenswrapper[4954]: I1206 06:58:16.929202 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:16Z","lastTransitionTime":"2025-12-06T06:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.032107 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.032145 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.032154 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.032170 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.032183 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.135931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.135966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.135977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.136010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.136021 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.238850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.238906 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.238920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.238943 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.238955 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.342078 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.342124 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.342135 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.342153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.342163 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.443221 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.443276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.443302 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:17 crc kubenswrapper[4954]: E1206 06:58:17.443425 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:17 crc kubenswrapper[4954]: E1206 06:58:17.443512 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:17 crc kubenswrapper[4954]: E1206 06:58:17.443613 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.444998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.445027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.445037 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.445054 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.445065 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.547729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.547781 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.547791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.547808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.547826 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.651485 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.651537 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.651546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.651589 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.651602 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.754744 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.754799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.754812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.754830 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.754843 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.856967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.857030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.857046 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.857066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.857082 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.959812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.959877 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.959891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.959910 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:17 crc kubenswrapper[4954]: I1206 06:58:17.959919 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:17Z","lastTransitionTime":"2025-12-06T06:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.062377 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.062419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.062431 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.062449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.062460 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.165156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.165204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.165215 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.165240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.165253 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.269074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.269130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.269143 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.269167 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.269187 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.371728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.371777 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.371791 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.371808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.371818 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.443049 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:18 crc kubenswrapper[4954]: E1206 06:58:18.443240 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.474664 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.474722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.474733 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.474750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.474761 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.577911 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.577971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.577994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.578021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.578039 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.680134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.680188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.680200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.680223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.680236 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.782616 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.782655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.782672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.782696 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.782712 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.885767 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.885813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.885823 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.885843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.885855 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.989671 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.990056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.990298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.990428 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:18 crc kubenswrapper[4954]: I1206 06:58:18.990539 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:18Z","lastTransitionTime":"2025-12-06T06:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.094041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.094086 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.094128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.094147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.094171 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.196388 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.196432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.196444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.196465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.196479 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.299340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.299406 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.299419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.299436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.299447 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.402010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.402363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.402456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.402544 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.402651 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.442847 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.443306 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:19 crc kubenswrapper[4954]: E1206 06:58:19.443678 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.443694 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:19 crc kubenswrapper[4954]: E1206 06:58:19.443962 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:19 crc kubenswrapper[4954]: E1206 06:58:19.443747 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.505402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.505751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.505969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.506076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.506159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.609265 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.609367 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.609394 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.609426 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.609447 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.713789 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.714113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.714188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.714254 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.714320 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.817184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.817457 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.817642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.817737 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.817810 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.919840 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.919890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.919902 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.919919 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.919930 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:19Z","lastTransitionTime":"2025-12-06T06:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:19 crc kubenswrapper[4954]: I1206 06:58:19.988381 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:19 crc kubenswrapper[4954]: E1206 06:58:19.988587 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:58:19 crc kubenswrapper[4954]: E1206 06:58:19.988664 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:58:51.988643317 +0000 UTC m=+106.802002706 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.023293 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.023358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.023370 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.023390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.023403 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.127614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.127662 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.127678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.127698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.127711 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.231760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.231845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.231869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.231895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.231914 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.335257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.335324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.335334 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.335354 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.335371 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.438845 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.438925 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.438949 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.438984 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.439010 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.443259 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:20 crc kubenswrapper[4954]: E1206 06:58:20.443474 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.542640 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.542724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.542743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.542774 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.542796 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.645223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.645272 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.645281 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.645298 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.645313 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.748750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.748808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.748823 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.748842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.748855 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.851487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.851543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.851553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.851591 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.851609 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.915363 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/0.log" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.915444 4954 generic.go:334] "Generic (PLEG): container finished" podID="1d174f37-f89e-4daf-a663-3cad4e33dad2" containerID="fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76" exitCode=1 Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.915496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerDied","Data":"fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.916116 4954 scope.go:117] "RemoveContainer" containerID="fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.931033 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:20Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.953421 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:20Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.957461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.957551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.957605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.957638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.957670 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:20Z","lastTransitionTime":"2025-12-06T06:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.974802 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:20Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:20 crc kubenswrapper[4954]: I1206 06:58:20.990855 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:20Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.005320 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.025217 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27
f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch 
factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.037935 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.051493 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.061521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.061632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.061647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.061668 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.061680 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.065469 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.077353 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.091063 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.108657 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.121600 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.138469 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.154142 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.164614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.164937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.165081 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.165198 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.165298 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.168506 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.269155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 
06:58:21.269533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.269623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.269691 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.269749 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.373110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.373152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.373161 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.373179 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.373190 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.442720 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.442752 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:21 crc kubenswrapper[4954]: E1206 06:58:21.442890 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.442911 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:21 crc kubenswrapper[4954]: E1206 06:58:21.443071 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:21 crc kubenswrapper[4954]: E1206 06:58:21.443168 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.475905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.475948 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.475957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.475975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.475986 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.578491 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.578553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.578618 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.578645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.578660 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.681374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.681422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.681432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.681449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.681462 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.785677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.785718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.785728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.785748 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.785761 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.889128 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.889173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.889183 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.889199 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.889209 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.922286 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/0.log" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.922358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerStarted","Data":"2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363"} Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.936639 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.948426 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.959378 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.978814 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27
f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c697dfcc1c3a912d53652e426355886489fb917864a5927509bc87e7bc6aa3e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:57:50Z\\\",\\\"message\\\":\\\"/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1206 06:57:50.487581 6408 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch 
factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.989405 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:21Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.991769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.991827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.991843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.991864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:21 crc kubenswrapper[4954]: I1206 06:58:21.991879 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:21Z","lastTransitionTime":"2025-12-06T06:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.002807 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.014741 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.027426 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.038961 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.057878 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.076799 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.090336 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.094148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.094223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc 
kubenswrapper[4954]: I1206 06:58:22.094234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.094257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.094270 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.105103 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.124611 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.142045 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.159315 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:22Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.196346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.196403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.196419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.196444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.196461 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.302174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.302220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.302234 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.302279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.302299 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.404414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.404450 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.404467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.404489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.404501 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.442826 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:22 crc kubenswrapper[4954]: E1206 06:58:22.442992 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.507866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.507916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.507930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.507952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.507965 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.610096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.610148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.610157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.610175 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.610186 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.712936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.713017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.713034 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.713056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.713069 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.816223 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.816277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.816290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.816311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.816327 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.919790 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.919865 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.919881 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.919910 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:22 crc kubenswrapper[4954]: I1206 06:58:22.919927 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:22Z","lastTransitionTime":"2025-12-06T06:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.023827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.023901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.023914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.023936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.023953 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.127181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.127217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.127236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.127256 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.127267 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.229951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.230011 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.230023 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.230042 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.230056 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.331897 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.331936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.331947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.331965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.331976 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.435280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.435349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.435365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.435386 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.435402 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.443232 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.443318 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:23 crc kubenswrapper[4954]: E1206 06:58:23.443394 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.443454 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:23 crc kubenswrapper[4954]: E1206 06:58:23.443625 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:23 crc kubenswrapper[4954]: E1206 06:58:23.443729 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.444486 4954 scope.go:117] "RemoveContainer" containerID="e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c" Dec 06 06:58:23 crc kubenswrapper[4954]: E1206 06:58:23.444745 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.460963 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.473722 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.490537 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.509375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.522814 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.538395 4954 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.538714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.538747 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.538771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.538788 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.545022 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27
f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.565359 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.594629 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.617987 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.636774 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.641687 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.641728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.641740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.641758 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.641770 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.652343 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.668505 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a8
0822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.682510 4954 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.697747 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.709345 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.726996 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:23Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.744377 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.744419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc 
kubenswrapper[4954]: I1206 06:58:23.744430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.744448 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.744461 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.847965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.848036 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.848047 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.848065 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.848078 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.950912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.950999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.951015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.951039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:23 crc kubenswrapper[4954]: I1206 06:58:23.951055 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:23Z","lastTransitionTime":"2025-12-06T06:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.053993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.054050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.054063 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.054084 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.054100 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.156583 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.156646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.156697 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.156720 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.156735 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.260248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.260342 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.260362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.260390 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.260410 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.363248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.363292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.363303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.363323 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.363337 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.442918 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:24 crc kubenswrapper[4954]: E1206 06:58:24.443089 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.466406 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.466493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.466517 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.466552 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.466613 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.569833 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.569891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.569903 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.569924 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.569937 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.673416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.673480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.673493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.673514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.673525 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.776529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.776622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.776637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.776655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.776672 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.879404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.879466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.879476 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.879493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.879507 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.982537 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.982630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.982645 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.982669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:24 crc kubenswrapper[4954]: I1206 06:58:24.982689 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:24Z","lastTransitionTime":"2025-12-06T06:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.086066 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.086121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.086131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.086147 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.086158 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.189614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.189675 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.189695 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.189718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.189732 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.292461 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.292523 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.292532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.292549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.292579 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.395478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.395545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.395580 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.395605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.395619 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.443301 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.443383 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.443454 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:25 crc kubenswrapper[4954]: E1206 06:58:25.443610 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:25 crc kubenswrapper[4954]: E1206 06:58:25.443666 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:25 crc kubenswrapper[4954]: E1206 06:58:25.443782 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.462310 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.475844 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.493326 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.498832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.498899 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.498913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.498936 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.498952 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.509120 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.524383 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.548583 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.562018 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.577868 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.596406 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.602596 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.602641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.602657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.602677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.602691 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.611277 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.627407 4954 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.644307 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f3
6dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.661726 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.676196 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.688762 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.704262 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:25Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.704867 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.704908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc 
kubenswrapper[4954]: I1206 06:58:25.704919 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.704937 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.704946 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.807529 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.807624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.807637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.807656 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.807671 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.911268 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.911324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.911339 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.911359 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:25 crc kubenswrapper[4954]: I1206 06:58:25.911368 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:25Z","lastTransitionTime":"2025-12-06T06:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.015096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.015144 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.015157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.015178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.015194 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.117583 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.117627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.117636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.117652 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.117662 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.221050 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.221110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.221125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.221149 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.221163 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.323986 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.324072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.324082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.324103 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.324121 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.427136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.427180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.427191 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.427211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.427221 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.443124 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:26 crc kubenswrapper[4954]: E1206 06:58:26.443266 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.453644 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.530515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.530603 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.530621 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.530648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.530669 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.634403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.634453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.634467 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.634497 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.634509 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.737005 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.737052 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.737064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.737082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.737097 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.839241 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.839290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.839303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.839324 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.839336 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.941729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.941779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.941793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.941843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:26 crc kubenswrapper[4954]: I1206 06:58:26.941859 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:26Z","lastTransitionTime":"2025-12-06T06:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.043978 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.044030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.044047 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.044070 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.044085 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.146863 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.146922 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.146934 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.146954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.146967 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.247657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.247709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.247725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.247747 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.247760 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.263435 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:27Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.267599 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.267635 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.267647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.267667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.267679 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.283083 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:27Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.286771 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.286800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.286811 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.286829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.286841 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.301394 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:27Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.305424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.305465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.305478 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.305498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.305511 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.327158 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:27Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.331884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.331945 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.331962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.331988 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.332002 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.345852 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:27Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.345969 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.348312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.348358 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.348375 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.348399 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.348416 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.443362 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.443403 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.443630 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.443924 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.443950 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:27 crc kubenswrapper[4954]: E1206 06:58:27.444336 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.451064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.451120 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.451132 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.451153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.451173 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.554032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.554077 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.554089 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.554107 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.554121 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.657340 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.657399 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.657410 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.657430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.657444 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.760346 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.760387 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.760405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.760432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.760443 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.863376 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.863419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.863436 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.863453 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.863465 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.969457 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.969503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.969513 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.969532 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:27 crc kubenswrapper[4954]: I1206 06:58:27.969543 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:27Z","lastTransitionTime":"2025-12-06T06:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.072389 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.072449 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.072459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.072475 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.072485 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.174785 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.174846 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.174861 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.174882 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.174897 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.278195 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.278252 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.278265 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.278285 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.278297 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.381685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.381730 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.381739 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.381759 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.381771 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.443108 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:28 crc kubenswrapper[4954]: E1206 06:58:28.443287 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.485027 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.485111 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.485130 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.485162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.485189 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.587660 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.587702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.587714 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.587731 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.587740 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.690582 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.690666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.690704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.690729 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.690742 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.794445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.794492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.794503 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.794527 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.794538 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.897745 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.897788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.897797 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.897813 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:28 crc kubenswrapper[4954]: I1206 06:58:28.897824 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:28Z","lastTransitionTime":"2025-12-06T06:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.001357 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.001420 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.001439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.001465 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.001486 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.103889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.104079 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.104097 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.104123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.104148 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.207506 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.207555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.207582 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.207601 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.207611 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.310856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.310971 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.310986 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.311015 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.311038 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.413285 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.413337 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.413347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.413366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.413379 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.443265 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.443294 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.443444 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:29 crc kubenswrapper[4954]: E1206 06:58:29.443631 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:29 crc kubenswrapper[4954]: E1206 06:58:29.443757 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:29 crc kubenswrapper[4954]: E1206 06:58:29.443953 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.516647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.516687 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.516698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.516716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.516726 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.619941 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.620003 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.620018 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.620038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.620051 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.723670 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.723728 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.723741 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.723763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.723777 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.826622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.826677 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.826689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.826707 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.826718 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.929239 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.929310 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.929323 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.929347 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:29 crc kubenswrapper[4954]: I1206 06:58:29.929364 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:29Z","lastTransitionTime":"2025-12-06T06:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.032526 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.032622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.032643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.032665 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.032680 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.136296 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.136352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.136363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.136383 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.136397 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.239718 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.239784 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.239793 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.239819 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.239834 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.348546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.348643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.348661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.348688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.348703 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.443113 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:30 crc kubenswrapper[4954]: E1206 06:58:30.443410 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.451202 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.451311 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.451339 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.451365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.451382 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.554492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.554554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.554582 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.554605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.554619 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.657029 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.657080 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.657090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.657110 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.657122 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.759987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.760594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.760638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.760672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.760686 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.863744 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.863829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.863843 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.863863 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.863878 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.966849 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.966900 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.966913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.966931 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:30 crc kubenswrapper[4954]: I1206 06:58:30.966943 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:30Z","lastTransitionTime":"2025-12-06T06:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.070446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.070489 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.070498 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.070515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.070528 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.173636 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.173711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.173724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.173747 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.173759 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.277114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.277174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.277188 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.277213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.277228 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.379970 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.380026 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.380040 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.380059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.380072 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.443379 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.443492 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.443399 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:31 crc kubenswrapper[4954]: E1206 06:58:31.443640 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:31 crc kubenswrapper[4954]: E1206 06:58:31.443763 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:31 crc kubenswrapper[4954]: E1206 06:58:31.443982 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.482688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.482790 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.482803 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.482826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.482843 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.585415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.585463 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.585474 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.585493 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.585504 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.689549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.689627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.689639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.689660 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.689676 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.792870 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.792923 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.792933 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.792954 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.792963 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.896552 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.896631 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.896646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.896684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.896698 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.999742 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.999800 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:31 crc kubenswrapper[4954]: I1206 06:58:31.999818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:31.999838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:31.999853 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:31Z","lastTransitionTime":"2025-12-06T06:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.102495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.102579 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.102598 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.102622 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.102638 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.205994 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.206064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.206087 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.206119 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.206140 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.309853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.309916 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.309929 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.309952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.309967 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.413395 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.413473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.413492 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.413522 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.413543 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.443260 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:32 crc kubenswrapper[4954]: E1206 06:58:32.443433 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.517038 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.517082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.517136 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.517155 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.517202 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.620166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.620235 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.620257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.620292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.620310 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.723249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.723338 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.723361 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.723392 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.723416 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.826692 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.826744 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.826755 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.826776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.826795 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.931445 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.931500 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.931514 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.931545 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:32 crc kubenswrapper[4954]: I1206 06:58:32.931586 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:32Z","lastTransitionTime":"2025-12-06T06:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.034504 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.034580 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.034591 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.034610 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.034620 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.137041 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.137090 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.137104 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.137123 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.137134 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.239977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.240017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.240028 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.240043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.240051 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.342133 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.342182 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.342192 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.342213 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.342222 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.443272 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.443356 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.443286 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:33 crc kubenswrapper[4954]: E1206 06:58:33.443468 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:33 crc kubenswrapper[4954]: E1206 06:58:33.443623 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:33 crc kubenswrapper[4954]: E1206 06:58:33.443718 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.444792 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.444826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.444839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.444853 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.444864 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.547278 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.547322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.547333 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.547353 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.547363 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.650127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.650186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.650196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.650217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.650235 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.752924 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.752987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.752999 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.753022 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.753035 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.856113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.856204 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.856218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.856240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.856254 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.959148 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.959196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.959239 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.959257 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:33 crc kubenswrapper[4954]: I1206 06:58:33.959269 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:33Z","lastTransitionTime":"2025-12-06T06:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.062236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.062705 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.062740 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.062763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.062775 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.164807 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.164857 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.164868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.164885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.164897 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.267236 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.267274 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.267282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.267301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.267310 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.370596 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.370643 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.370654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.370674 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.370688 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.442642 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:34 crc kubenswrapper[4954]: E1206 06:58:34.442797 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.473515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.473612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.473626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.473654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.473665 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.576231 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.576291 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.576301 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.576321 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.576336 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.679409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.679460 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.679471 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.679487 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.679498 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.782799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.782898 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.782914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.782957 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.782974 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.885991 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.886079 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.886093 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.886117 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.886131 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.988966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.989020 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.989033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.989055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:34 crc kubenswrapper[4954]: I1206 06:58:34.989068 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:34Z","lastTransitionTime":"2025-12-06T06:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.096220 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.096290 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.096303 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.096331 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.096345 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.199830 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.199885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.199895 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.199913 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.199926 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.302901 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.302966 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.302977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.302998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.303012 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.407228 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.407305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.407320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.407348 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.407365 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.442824 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.442891 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:35 crc kubenswrapper[4954]: E1206 06:58:35.442997 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.443035 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:35 crc kubenswrapper[4954]: E1206 06:58:35.443180 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:35 crc kubenswrapper[4954]: E1206 06:58:35.443288 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.460469 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.475725 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.494054 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.510750 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.510810 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.510826 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.510850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.510867 4954 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.512172 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.529539 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.541512 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.554004 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82dc64fa-c1cb-48ce-9c45-07e6f082bf06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c69751b0a690ad14e6835009a7861b14636a1e0c533211456f376f0368bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.567681 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.583341 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.599422 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.613975 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.614030 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.614042 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.614061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.614075 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.616142 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.631375 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.656830 4954 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.670651 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.683711 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.698721 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.713223 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:35Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.717185 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.717264 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.717279 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.717302 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.717318 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.820787 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.820841 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.820855 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.820876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.820890 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.923788 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.923842 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.923856 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.923879 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:35 crc kubenswrapper[4954]: I1206 06:58:35.923893 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:35Z","lastTransitionTime":"2025-12-06T06:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.026724 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.026769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.026781 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.026799 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.026810 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.130021 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.130094 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.130113 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.130139 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.130159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.233106 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.233172 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.233187 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.233211 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.233227 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.335795 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.335839 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.335850 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.335868 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.335881 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.437688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.437739 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.437751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.437770 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.437784 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.442302 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:36 crc kubenswrapper[4954]: E1206 06:58:36.442489 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.540615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.540685 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.540698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.540723 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.540738 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.646125 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.646869 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.646962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.647059 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.647149 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.749585 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.749637 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.749651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.749672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.749687 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.852838 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.853197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.853277 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.853646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.853737 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.956639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.956688 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.956698 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.956715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:36 crc kubenswrapper[4954]: I1206 06:58:36.956726 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:36Z","lastTransitionTime":"2025-12-06T06:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.059587 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.059627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.059638 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.059655 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.059668 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.162720 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.162794 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.162809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.162836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.162853 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.265398 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.265444 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.265456 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.265473 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.265485 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.275074 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.275424 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.275371174 +0000 UTC m=+156.088730563 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.368280 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.368350 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.368366 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.368404 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.368422 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.376050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.376106 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.376133 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.376163 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376242 4954 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376318 4954 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376349 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.37632366 +0000 UTC m=+156.189683049 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376384 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.376363551 +0000 UTC m=+156.189722940 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376394 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376436 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376431 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376504 4954 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376522 4954 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376453 4954 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376619 4954 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.376589788 +0000 UTC m=+156.189949177 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.376643 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.376636309 +0000 UTC m=+156.189995698 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.414751 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.414820 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.414836 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.414857 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.414871 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.430862 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.436091 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.436140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.436153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.436173 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.436194 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.442598 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.442622 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.442907 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.442981 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.443470 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.443605 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.444354 4954 scope.go:117] "RemoveContainer" containerID="e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.450946 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.458109 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.458169 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.458180 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.458200 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.458210 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.461482 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.471523 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.477168 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.477709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.477815 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.477912 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.477976 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.492005 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.502551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.502611 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.502626 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.502649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.502664 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.518621 4954 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b7a4e02-eeee-45d6-a19e-861f2b430acc\\\",\\\"systemUUID\\\":\\\"a29bc87b-f01f-4645-b6f9-deab79f2c4b3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 
2025-08-24T17:21:41Z" Dec 06 06:58:37 crc kubenswrapper[4954]: E1206 06:58:37.519036 4954 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.521017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.521056 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.521070 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.521092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.521105 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.623480 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.623889 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.624002 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.624096 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.624168 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.727516 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.727594 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.727617 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.727641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.727656 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.830725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.830779 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.830789 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.830808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.830818 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.933269 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.933302 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.933312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.933369 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.933387 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:37Z","lastTransitionTime":"2025-12-06T06:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
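
Independently of the webhook failure, the kubelet keeps reporting Ready=False because /etc/kubernetes/cni/net.d/ contains no CNI configuration yet; the restarting ovnkube-node pod above is what eventually writes it. A small sketch mirroring the kubelet's check (assumptions: run on the node; libcni's usual config extensions .conf/.conflist/.json):

// cnicheck.go - lists the CNI config dir named in the NetworkReady=false condition.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	found := false
	for _, e := range entries {
		ext := filepath.Ext(e.Name())
		if ext == ".conf" || ext == ".conflist" || strings.HasSuffix(e.Name(), ".json") {
			fmt.Println("CNI config present:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		// Matches the kubelet condition message logged above.
		fmt.Println("no CNI configuration file in", dir)
	}
}
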
Has your network provider started?"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.981123 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/2.log" Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.987487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1"} Dec 06 06:58:37 crc kubenswrapper[4954]: I1206 06:58:37.987929 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.002935 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:37Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.016804 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
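
The same x509 failure now repeats per pod through the pod.network-node-identity.openshift.io webhook, so individual "Failed to update status for pod" lines add little new signal. A sketch that groups journal lines by failing webhook and certificate NotAfter instead of by pod (assumptions: lines arrive on stdin, e.g. piped from journalctl -u kubelet --no-pager; the regex tolerates the \"-escaped quotes inside err= strings):

// triage.go - tally webhook TLS failures across journal lines.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Matches: failed calling webhook \"NAME\" ... is after NOTAFTER
	re := regexp.MustCompile(`failed calling webhook \\?"([^"\\]+)\\?".*is after ([0-9TZ:-]+)`)
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines here are very long
	counts := map[string]int{}
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]+" (cert NotAfter "+m[2]+")"]++
		}
	}
	for k, n := range counts {
		fmt.Printf("%6d  %s\n", n, k)
	}
}
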
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.028713 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.035114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.035152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.035165 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.035183 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.035194 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
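
For completeness, the condition object that setters.go logs each time ("Node became not ready") is ordinary JSON once unwrapped; a sketch decoding that payload, with the field values copied verbatim from the 06:58:38 entry above:

// condition.go - decode the logged NodeNotReady condition.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// NodeCondition mirrors the fields present in the logged condition object.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s reason=%s\n", c.Type, c.Status, c.Reason)
}
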
Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.047992 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.060778 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.075885 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.091994 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] 
multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.103055 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.116863 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.135383 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.137360 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.137403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc 
kubenswrapper[4954]: I1206 06:58:38.137415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.137432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.137446 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.150220 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.163641 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.175086 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.190582 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.202858 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82dc64fa-c1cb-48ce-9c45-07e6f082bf06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c69751b0a690ad14e6835009a7861b14636a1e0c533211456f376f0368bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.226755 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb4b4cd8-7492-47e3-95e4-794bcdcee20c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4812b6f0d6a5ec757fa7d811ea0e10399dbc2a99eef24403cabe4f78b8b9f070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c6758d19607e84d046b2f58804bf8cac4af3bf5b40ed7d6cf1541d73006db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b4f2933b80af3e40fc62e3a39fc6ed040324911ff2e2ede84b7eb9d2705c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddfa018798e201b7ae74d5c443999b108945c84
a4a3fa803abb26256c8a4a8dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f25a446fadfb363ee38298b4b5a561017cb89c2c9099555ed50a8ca7d9e1296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8684a7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8684a7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.239801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.239849 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.239860 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.239905 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.239916 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.252727 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.265220 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:38Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.342517 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.342551 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.342573 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.342591 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.342599 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.442753 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:38 crc kubenswrapper[4954]: E1206 06:58:38.442928 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.444521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.444578 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.444588 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.444604 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.444616 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.547752 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.547801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.547812 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.547829 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.547844 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.650249 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.650291 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.650300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.650316 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.650326 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.753491 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.753585 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.753623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.753653 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.753669 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.856678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.856735 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.856746 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.856765 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.856777 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.959650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.959711 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.959726 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.959746 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:38 crc kubenswrapper[4954]: I1206 06:58:38.959762 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:38Z","lastTransitionTime":"2025-12-06T06:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.062245 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.062300 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.062312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.062332 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.062346 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.164885 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.164941 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.164958 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.164980 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.164995 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.268116 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.268174 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.268184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.268207 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.268220 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.371535 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.371624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.371639 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.371663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.371676 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.442539 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.442945 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:39 crc kubenswrapper[4954]: E1206 06:58:39.443023 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:39 crc kubenswrapper[4954]: E1206 06:58:39.443099 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.442628 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:39 crc kubenswrapper[4954]: E1206 06:58:39.443218 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.474320 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.474372 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.474385 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.474409 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.474424 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.576921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.576964 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.576974 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.577025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.577040 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.679541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.679615 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.679630 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.679650 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.679662 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.783240 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.783327 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.783341 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.783369 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.783387 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.885502 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.885578 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.885592 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.885614 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.885625 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.988641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.988687 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.988699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.988716 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.988727 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:39Z","lastTransitionTime":"2025-12-06T06:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.993279 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/3.log" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.993862 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/2.log" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.996149 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" exitCode=1 Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.996188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1"} Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.996254 4954 scope.go:117] "RemoveContainer" containerID="e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c" Dec 06 06:58:39 crc kubenswrapper[4954]: I1206 06:58:39.997526 4954 scope.go:117] "RemoveContainer" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" Dec 06 06:58:39 crc kubenswrapper[4954]: E1206 06:58:39.997787 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.011960 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62012aee2e023a49c335d66decf7c8b3122db4978719085f42847a6bd20d926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.024793 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4a91a8e33c51bb671e7c9976b4da41667c102923bc5ca7d4444f04f1c07b8ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafc05773e960792209b97d17b4e81e5dd341a18a9b91cd53f4c22e9bdb644b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.041775 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jstpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee79602b-4e64-45c3-a608-9d312315f206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c5b2d85235e3bdf5ebc541c44d290f88b9b6e6b843394d4cf1c5173fafccab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs4jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jstpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.057705 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n27lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36967035-88f9-47a3-a15a-bce812678973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a9ed9476e869d3abc4d6d85c290de6638e3e573732dc51b8d8a0a2178b588a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f6c52b2f527d9e88d4356aeef76ac3dc3c989f8e8d75d9a8b0ff80b048b9976\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ea121efd6529106f3f44e1fed8343d0f4033ca1d39e5e85f849d96ef09f1926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc78157177b9fcbcf8ca30edac95a4289f125322743de0b1d8ed1b124893003b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f02ec6a620b47c26158d1e95b24ccd6220d1c801e679c1212e67a922b78d9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4abffbacb436dc197115cf38a6a436422741fe0879b22d5461c05a0e3641a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bcce6f5214f4b948f8df1512f90aa3df27c21f2f17b38b7943a3735ec97b013\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts786\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n27lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.073454 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd399e-52cb-401b-9d4c-7c4b9dae34cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2ac34c62664b0a17ec26755307ecae78d4118b236c66819ebad906c1c092cd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4bb75a39170b64e9b46994f3955c491ecb3f2155fc552c2738a5ece2e22de21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1783fc9d0986247d5ada8563d17661cb609d146db208e05609175315554e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2080ed442ba6d89087f3d682adff0163fb421ab52c9a4f3c65d477c9c9cae21d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.091938 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.091967 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.091976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.091993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.092006 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.092259 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"598b09d0-83c1-4b56-b622-5d01f4792661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T06:57:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW1206 06:57:31.137522 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1206 06:57:31.137741 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 06:57:31.138536 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2441179559/tls.crt::/tmp/serving-cert-2441179559/tls.key\\\\\\\"\\\\nI1206 06:57:31.455928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1206 06:57:31.458748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1206 06:57:31.458791 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1206 06:57:31.458828 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1206 06:57:31.458836 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1206 06:57:31.463439 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1206 06:57:31.463456 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1206 06:57:31.463481 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463490 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1206 06:57:31.463497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1206 06:57:31.463503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1206 06:57:31.463507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1206 06:57:31.463511 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1206 06:57:31.466738 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.106146 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a9aa90b-56ad-4dfc-be90-8030f06801d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf091df11e79d054cc80b0c6c9a87e0b992534785eec51ac1c3f4fcb83c37df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72fc3985dbc67a8351d7c7d677947fa3aef6464c304111e339b6a51cbafcc68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxt9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d7wx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 
06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.115813 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82dc64fa-c1cb-48ce-9c45-07e6f082bf06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b60c69751b0a690ad14e6835009a7861b14636a1e0c533211456f376f0368bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e934f5a591fbeb54465e030ae77d3fea9e7f12f6a1ea32605a27ebbbffe4540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.132861 4954 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb4b4cd8-7492-47e3-95e4-794bcdcee20c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4812b6f0d6a5ec757fa7d811ea0e10399dbc2a99eef24403cabe4f78b8b9f070\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c6758d19607e84d046b2f58804bf8cac4af3bf5b40ed7d6cf1541d73006db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b4f2933b80af3e40fc62e3a39fc6ed040324911ff2e2ede84b7eb9d2705c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://ddfa018798e201b7ae74d5c443999b108945c84a4a3fa803abb26256c8a4a8dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f25a446fadfb363ee38298b4b5a561017cb89c2c9099555ed50a8ca7d9e1296\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763a52c30a31474f4d6156eed9d7c9cbca1fcf0cef53e74089ff36760616b1c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bb58bc81fa75f21963abf055a7591b6a48b2d73f576cc10bbc802043719ce27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8684a
7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8684a7d806f9b8fde08fa898665202f479d8cd1b7b779667516fadbd2edf6047\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.143546 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b5eb734d5f64022fab0063ef60a663765d5232004573f43a2105602f1ab7aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.159429 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc
/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3c28aa133d267730380afda180ca2bced01de27f310fe2cc0697e03f4d1646c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:08Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:07.200471 6642 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 06:58:07.200498 6642 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 06:58:07.200501 6642 handler.go:208] Removed *v1.Node event handler 7\\\\nI1206 06:58:07.200506 6642 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 06:58:07.200546 6642 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:07.200571 6642 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 06:58:07.200596 6642 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1206 06:58:07.200649 6642 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 06:58:07.200554 6642 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:07.200740 6642 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 06:58:07.200779 6642 factory.go:656] Stopping watch factory\\\\nI1206 06:58:07.200986 6642 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1206 06:58:07.201073 6642 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1206 06:58:07.201107 6642 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:07.201131 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:07.201195 6642 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:38Z\\\",\\\"message\\\":\\\"5801 6995 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:58:38.696050 6995 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:58:38.696502 6995 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:58:38.696801 6995 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 06:58:38.701085 6995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 06:58:38.701136 6995 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 06:58:38.701209 6995 factory.go:656] Stopping watch factory\\\\nI1206 06:58:38.701222 6995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 06:58:38.701268 6995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 06:58:38.724910 6995 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1206 06:58:38.724927 6995 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1206 06:58:38.724997 6995 ovnkube.go:599] Stopped ovnkube\\\\nI1206 06:58:38.725030 6995 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1206 06:58:38.725097 6995 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f2rg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-crz6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.167971 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7gzpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d30cc7-60f8-4b93-ab83-4740d5568464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93627dac952ab363f0b9f524e4fdbb5cf800f5cd601e24ea1448e706775caf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqm57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7gzpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.186872 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.194121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.194166 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.194177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.194196 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.194208 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.201485 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.212287 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0babbe-21ce-42f4-90cf-c3eb21991413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94a890f812f87b331a7c46a3abdbcd316609139f7303cd5230432fec1d8fb0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cbrq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5lgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.223778 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.236835 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rsvgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d174f37-f89e-4daf-a663-3cad4e33dad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T06:58:20Z\\\",\\\"message\\\":\\\"2025-12-06T06:57:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e\\\\n2025-12-06T06:57:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_097e22bc-457d-46ce-accd-cb98cea0c49e to /host/opt/cni/bin/\\\\n2025-12-06T06:57:35Z [verbose] multus-daemon started\\\\n2025-12-06T06:57:35Z [verbose] Readiness Indicator file check\\\\n2025-12-06T06:58:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T06:57:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T06:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppbzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rsvgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.249320 4954 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T06:57:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwcqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T06:57:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vtxfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T06:58:40Z is after 2025-08-24T17:21:41Z"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.297446 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.297499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.297512 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.297539 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.297552 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.401306 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.401363 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.401374 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.401393 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.401405 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.443317 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:58:40 crc kubenswrapper[4954]: E1206 06:58:40.443512 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.504049 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.504105 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.504118 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.504138 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.504150 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.606322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.606371 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.606382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.606403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.606417 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.709521 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.709583 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.709593 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.709612 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.709622 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.812351 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.812402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.812414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.812432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.812445 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.914866 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.914909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.914921 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.914940 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:40 crc kubenswrapper[4954]: I1206 06:58:40.914961 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:40Z","lastTransitionTime":"2025-12-06T06:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.001552 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/3.log"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.017848 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.017891 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.017903 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.017923 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.017934 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.120205 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.120271 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.120285 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.120308 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.120322 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.223721 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.223786 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.223805 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.223834 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.223854 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.326401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.326432 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.326440 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.326657 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.326680 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.429355 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.429401 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.429414 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.429434 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.429446 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.442858 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.442905 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.442952 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:41 crc kubenswrapper[4954]: E1206 06:58:41.443130 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:41 crc kubenswrapper[4954]: E1206 06:58:41.443191 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:41 crc kubenswrapper[4954]: E1206 06:58:41.443289 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.532226 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.532424 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.532439 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.532459 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.532470 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.635117 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.635153 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.635162 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.635178 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.635188 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.737555 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.737876 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.737890 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.737908 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.737920 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.840365 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.840405 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.840416 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.840433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.840443 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.943085 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.943126 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.943137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.943156 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:41 crc kubenswrapper[4954]: I1206 06:58:41.943170 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:41Z","lastTransitionTime":"2025-12-06T06:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.046649 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.046713 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.046725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.046743 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.046754 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.149082 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.149119 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.149131 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.149149 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.149159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.252010 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.252053 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.252064 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.252083 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.252098 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.354951 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.355016 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.355033 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.355055 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.355071 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.443121 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:58:42 crc kubenswrapper[4954]: E1206 06:58:42.443315 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.457127 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.457181 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.457193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.457212 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.457223 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.560932 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.561002 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.561017 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.561039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.561052 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.663608 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.663663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.663678 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.663703 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.663717 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.766322 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.766382 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.766396 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.766421 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.766437 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.869914 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.869961 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.869972 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.869993 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.870005 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.972554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.972647 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.972661 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.972686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:42 crc kubenswrapper[4954]: I1206 06:58:42.972699 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:42Z","lastTransitionTime":"2025-12-06T06:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.075553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.075648 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.075662 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.075686 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.075699 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.178541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.178624 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.178642 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.178669 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.178686 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.281576 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.281627 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.281644 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.281663 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.281675 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.384654 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.384694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.384704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.384722 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.384734 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.442695 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.442762 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.442867 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz"
Dec 06 06:58:43 crc kubenswrapper[4954]: E1206 06:58:43.442866 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 06:58:43 crc kubenswrapper[4954]: E1206 06:58:43.443056 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457"
Dec 06 06:58:43 crc kubenswrapper[4954]: E1206 06:58:43.443215 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.487709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.487760 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.487769 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.487786 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.487797 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.590349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.590420 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.590430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.590447 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.590457 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.693499 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.693541 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.693554 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.693585 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.693596 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.795641 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.795699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.795709 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.795725 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.795735 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.897864 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.897909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.897920 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.897941 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:43 crc kubenswrapper[4954]: I1206 06:58:43.897952 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:43Z","lastTransitionTime":"2025-12-06T06:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.000632 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.000672 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.000684 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.000704 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.000717 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.103605 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.103666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.103679 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.103699 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.103711 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.207176 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.207259 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.207270 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.207287 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.207297 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.310472 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.310533 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.310543 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.310579 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.310590 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.412935 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.412977 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.412987 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.413004 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.413015 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.442850 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:44 crc kubenswrapper[4954]: E1206 06:58:44.443042 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.515757 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.515809 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.515824 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.515844 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.515858 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.618702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.618763 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.618778 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.618802 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.618818 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.721893 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.721947 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.721956 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.721976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.721996 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.824623 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.824667 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.824676 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.824694 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.824706 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.934909 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.934976 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.934990 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.935012 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:44 crc kubenswrapper[4954]: I1206 06:58:44.935028 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:44Z","lastTransitionTime":"2025-12-06T06:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.038229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.038292 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.038305 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.038326 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.038339 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.140952 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.140998 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.141009 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.141028 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.141040 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.243466 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.243515 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.243525 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.243546 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.243584 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.346352 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.346403 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.346415 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.346433 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.346446 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.442473 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.442489 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:45 crc kubenswrapper[4954]: E1206 06:58:45.442719 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:45 crc kubenswrapper[4954]: E1206 06:58:45.442766 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.442920 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:45 crc kubenswrapper[4954]: E1206 06:58:45.443461 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.448776 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.448808 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.448818 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.448832 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.448844 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.485163 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rsvgk" podStartSLOduration=72.485133528 podStartE2EDuration="1m12.485133528s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.485055476 +0000 UTC m=+100.298414865" watchObservedRunningTime="2025-12-06 06:58:45.485133528 +0000 UTC m=+100.298492937" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.542121 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n27lq" podStartSLOduration=72.54209594 podStartE2EDuration="1m12.54209594s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.521849857 +0000 UTC m=+100.335209296" watchObservedRunningTime="2025-12-06 06:58:45.54209594 +0000 UTC m=+100.355455329" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.542399 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.542380658 podStartE2EDuration="51.542380658s" podCreationTimestamp="2025-12-06 06:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.542214543 +0000 UTC m=+100.355573952" watchObservedRunningTime="2025-12-06 06:58:45.542380658 +0000 UTC m=+100.355740047" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.551074 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.551140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.551158 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc 
kubenswrapper[4954]: I1206 06:58:45.551184 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.551201 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.581750 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.58171488 podStartE2EDuration="1m11.58171488s" podCreationTimestamp="2025-12-06 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.566649974 +0000 UTC m=+100.380009383" watchObservedRunningTime="2025-12-06 06:58:45.58171488 +0000 UTC m=+100.395074309" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.611900 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jstpl" podStartSLOduration=72.611866973 podStartE2EDuration="1m12.611866973s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.61103214 +0000 UTC m=+100.424391549" watchObservedRunningTime="2025-12-06 06:58:45.611866973 +0000 UTC m=+100.425226382" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.627366 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.627329271 podStartE2EDuration="19.627329271s" podCreationTimestamp="2025-12-06 06:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.626450056 +0000 UTC m=+100.439809445" watchObservedRunningTime="2025-12-06 06:58:45.627329271 +0000 UTC m=+100.440688680" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.654011 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.654058 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.654072 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.654092 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.654106 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.667674 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=8.667649561 podStartE2EDuration="8.667649561s" podCreationTimestamp="2025-12-06 06:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.666772897 +0000 UTC m=+100.480132306" watchObservedRunningTime="2025-12-06 06:58:45.667649561 +0000 UTC m=+100.481008950" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.732063 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d7wx6" podStartSLOduration=72.732038463 podStartE2EDuration="1m12.732038463s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.731181659 +0000 UTC m=+100.544541098" watchObservedRunningTime="2025-12-06 06:58:45.732038463 +0000 UTC m=+100.545397852" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.756768 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.756815 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.756827 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.756847 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.756861 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.778423 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podStartSLOduration=72.778399014 podStartE2EDuration="1m12.778399014s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.778324512 +0000 UTC m=+100.591683901" watchObservedRunningTime="2025-12-06 06:58:45.778399014 +0000 UTC m=+100.591758403" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.812478 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7gzpg" podStartSLOduration=72.812449018 podStartE2EDuration="1m12.812449018s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:45.81217561 +0000 UTC m=+100.625534989" watchObservedRunningTime="2025-12-06 06:58:45.812449018 +0000 UTC m=+100.625808407" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.860157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.860215 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.860229 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.860248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.860259 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.962646 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.962702 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.962715 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.962736 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:45 crc kubenswrapper[4954]: I1206 06:58:45.962751 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:45Z","lastTransitionTime":"2025-12-06T06:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.065193 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.065248 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.065260 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.065282 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.065294 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.168159 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.168206 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.168218 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.168239 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.168254 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.270463 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.270528 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.270544 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.270584 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.270602 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.374419 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.374495 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.374520 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.374553 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.374621 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.442310 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:46 crc kubenswrapper[4954]: E1206 06:58:46.442429 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.477076 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.477137 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.477152 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.477177 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.477196 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.580208 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.580284 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.580294 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.580312 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.580330 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.683330 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.683394 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.683407 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.683430 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.683446 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.785871 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.785930 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.785944 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.785965 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.785979 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.889979 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.890043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.890061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.890114 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.890133 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.992811 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.993157 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.993247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.993362 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:46 crc kubenswrapper[4954]: I1206 06:58:46.993474 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:46Z","lastTransitionTime":"2025-12-06T06:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.096349 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.096393 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.096402 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.096422 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.096436 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.205593 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.205651 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.205666 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.205689 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.205704 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.308801 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.308849 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.308859 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.308878 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.308888 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.411602 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.411884 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.411962 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.412032 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.412096 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.442917 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.442991 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.443086 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:47 crc kubenswrapper[4954]: E1206 06:58:47.443081 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:47 crc kubenswrapper[4954]: E1206 06:58:47.443195 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:47 crc kubenswrapper[4954]: E1206 06:58:47.443270 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.515088 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.515134 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.515146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.515165 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.515177 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.618140 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.618186 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.618197 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.618217 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.618228 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.722043 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.722109 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.722121 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.722146 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.722159 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.809969 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.810025 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.810039 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.810061 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.810075 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.834510 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.834549 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.834579 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.834599 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.834610 4954 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T06:58:47Z","lastTransitionTime":"2025-12-06T06:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.870472 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml"] Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.870987 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.872944 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.873127 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.874596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.877701 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.987740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.987831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.987889 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.987922 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:47 crc kubenswrapper[4954]: I1206 06:58:47.988096 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.089490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.089557 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.089651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.089708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.089730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.090003 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.090062 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.090694 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.096883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.112253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9c6ml\" (UID: \"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.188114 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" Dec 06 06:58:48 crc kubenswrapper[4954]: I1206 06:58:48.443485 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:48 crc kubenswrapper[4954]: E1206 06:58:48.444327 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.036744 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" event={"ID":"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb","Type":"ContainerStarted","Data":"b69a3a23ea50256eb6fe1331f43fff1c717959bb5bed8840a9e311264427211d"} Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.036813 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" event={"ID":"ef3f3e81-fa79-41dd-b673-59d6bb7fe3bb","Type":"ContainerStarted","Data":"1fd93785025a361ae80f899343a8e0753d743b983922e8e95fdb5dd575db26fb"} Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.057027 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9c6ml" podStartSLOduration=76.056986415 podStartE2EDuration="1m16.056986415s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:49.055104702 +0000 UTC m=+103.868464111" watchObservedRunningTime="2025-12-06 06:58:49.056986415 +0000 UTC m=+103.870345834" Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.442926 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.443030 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.443097 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:49 crc kubenswrapper[4954]: E1206 06:58:49.443107 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:49 crc kubenswrapper[4954]: E1206 06:58:49.443266 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:49 crc kubenswrapper[4954]: E1206 06:58:49.443372 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:49 crc kubenswrapper[4954]: I1206 06:58:49.458824 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 06 06:58:50 crc kubenswrapper[4954]: I1206 06:58:50.443309 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:50 crc kubenswrapper[4954]: E1206 06:58:50.443629 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:51 crc kubenswrapper[4954]: I1206 06:58:51.443362 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:51 crc kubenswrapper[4954]: I1206 06:58:51.443515 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:51 crc kubenswrapper[4954]: E1206 06:58:51.443737 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:51 crc kubenswrapper[4954]: E1206 06:58:51.443934 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:51 crc kubenswrapper[4954]: I1206 06:58:51.443978 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:51 crc kubenswrapper[4954]: E1206 06:58:51.444162 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:52 crc kubenswrapper[4954]: I1206 06:58:52.037810 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:52 crc kubenswrapper[4954]: E1206 06:58:52.038016 4954 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:58:52 crc kubenswrapper[4954]: E1206 06:58:52.038130 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs podName:9377db43-9e5b-41e9-a9bc-f5fe3a81a457 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:56.038097529 +0000 UTC m=+170.851456928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs") pod "network-metrics-daemon-vtxfz" (UID: "9377db43-9e5b-41e9-a9bc-f5fe3a81a457") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 06:58:52 crc kubenswrapper[4954]: I1206 06:58:52.442885 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:52 crc kubenswrapper[4954]: E1206 06:58:52.443107 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:53 crc kubenswrapper[4954]: I1206 06:58:53.442716 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:53 crc kubenswrapper[4954]: I1206 06:58:53.442936 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:53 crc kubenswrapper[4954]: E1206 06:58:53.443121 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:53 crc kubenswrapper[4954]: I1206 06:58:53.443142 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:53 crc kubenswrapper[4954]: E1206 06:58:53.443672 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:53 crc kubenswrapper[4954]: E1206 06:58:53.443540 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:54 crc kubenswrapper[4954]: I1206 06:58:54.443057 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:54 crc kubenswrapper[4954]: E1206 06:58:54.443274 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:55 crc kubenswrapper[4954]: I1206 06:58:55.442502 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:55 crc kubenswrapper[4954]: I1206 06:58:55.442538 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:55 crc kubenswrapper[4954]: I1206 06:58:55.442822 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:55 crc kubenswrapper[4954]: E1206 06:58:55.444376 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:55 crc kubenswrapper[4954]: E1206 06:58:55.444551 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:55 crc kubenswrapper[4954]: E1206 06:58:55.445414 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:55 crc kubenswrapper[4954]: I1206 06:58:55.446525 4954 scope.go:117] "RemoveContainer" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" Dec 06 06:58:55 crc kubenswrapper[4954]: E1206 06:58:55.446768 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" Dec 06 06:58:55 crc kubenswrapper[4954]: I1206 06:58:55.467419 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=6.467347781 podStartE2EDuration="6.467347781s" podCreationTimestamp="2025-12-06 06:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:58:55.463700877 +0000 UTC m=+110.277060276" watchObservedRunningTime="2025-12-06 06:58:55.467347781 +0000 UTC m=+110.280707230" Dec 06 06:58:56 crc kubenswrapper[4954]: I1206 06:58:56.442370 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:56 crc kubenswrapper[4954]: E1206 06:58:56.442509 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:57 crc kubenswrapper[4954]: I1206 06:58:57.443252 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:57 crc kubenswrapper[4954]: I1206 06:58:57.443276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:57 crc kubenswrapper[4954]: E1206 06:58:57.443530 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:58:57 crc kubenswrapper[4954]: E1206 06:58:57.443702 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:57 crc kubenswrapper[4954]: I1206 06:58:57.443934 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:57 crc kubenswrapper[4954]: E1206 06:58:57.444024 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:58 crc kubenswrapper[4954]: I1206 06:58:58.443296 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:58:58 crc kubenswrapper[4954]: E1206 06:58:58.443681 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:58:59 crc kubenswrapper[4954]: I1206 06:58:59.442822 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:58:59 crc kubenswrapper[4954]: I1206 06:58:59.442888 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:58:59 crc kubenswrapper[4954]: I1206 06:58:59.443005 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:58:59 crc kubenswrapper[4954]: E1206 06:58:59.443131 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:58:59 crc kubenswrapper[4954]: E1206 06:58:59.443255 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:58:59 crc kubenswrapper[4954]: E1206 06:58:59.443439 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:00 crc kubenswrapper[4954]: I1206 06:59:00.443263 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:00 crc kubenswrapper[4954]: E1206 06:59:00.443471 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:01 crc kubenswrapper[4954]: I1206 06:59:01.443494 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:01 crc kubenswrapper[4954]: E1206 06:59:01.443797 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:01 crc kubenswrapper[4954]: I1206 06:59:01.443319 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:01 crc kubenswrapper[4954]: I1206 06:59:01.444187 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:01 crc kubenswrapper[4954]: E1206 06:59:01.444462 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:01 crc kubenswrapper[4954]: E1206 06:59:01.444811 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:02 crc kubenswrapper[4954]: I1206 06:59:02.442790 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:02 crc kubenswrapper[4954]: E1206 06:59:02.442960 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:03 crc kubenswrapper[4954]: I1206 06:59:03.442803 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:03 crc kubenswrapper[4954]: I1206 06:59:03.442920 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:03 crc kubenswrapper[4954]: E1206 06:59:03.443152 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:03 crc kubenswrapper[4954]: I1206 06:59:03.443193 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:03 crc kubenswrapper[4954]: E1206 06:59:03.443357 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:03 crc kubenswrapper[4954]: E1206 06:59:03.443532 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:04 crc kubenswrapper[4954]: I1206 06:59:04.442708 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:04 crc kubenswrapper[4954]: E1206 06:59:04.443081 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:05 crc kubenswrapper[4954]: E1206 06:59:05.403725 4954 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 06 06:59:05 crc kubenswrapper[4954]: I1206 06:59:05.442697 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:05 crc kubenswrapper[4954]: I1206 06:59:05.442771 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:05 crc kubenswrapper[4954]: I1206 06:59:05.442814 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:05 crc kubenswrapper[4954]: E1206 06:59:05.443801 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:05 crc kubenswrapper[4954]: E1206 06:59:05.443903 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:05 crc kubenswrapper[4954]: E1206 06:59:05.443998 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:06 crc kubenswrapper[4954]: E1206 06:59:06.441545 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 06:59:06 crc kubenswrapper[4954]: I1206 06:59:06.442328 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:06 crc kubenswrapper[4954]: E1206 06:59:06.442556 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.105593 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/1.log" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.106032 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/0.log" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.106072 4954 generic.go:334] "Generic (PLEG): container finished" podID="1d174f37-f89e-4daf-a663-3cad4e33dad2" containerID="2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363" exitCode=1 Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.106108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerDied","Data":"2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363"} Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.106150 4954 scope.go:117] "RemoveContainer" containerID="fd11bc1ea8c25f9275ae80a34b631d3638d25babbc9b3b7db9aa378e5812ae76" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.106544 4954 scope.go:117] "RemoveContainer" containerID="2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363" Dec 06 06:59:07 crc kubenswrapper[4954]: E1206 06:59:07.106732 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-rsvgk_openshift-multus(1d174f37-f89e-4daf-a663-3cad4e33dad2)\"" pod="openshift-multus/multus-rsvgk" podUID="1d174f37-f89e-4daf-a663-3cad4e33dad2" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.442740 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:07 crc kubenswrapper[4954]: E1206 06:59:07.443870 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.443125 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:07 crc kubenswrapper[4954]: E1206 06:59:07.444169 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:07 crc kubenswrapper[4954]: I1206 06:59:07.442740 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:07 crc kubenswrapper[4954]: E1206 06:59:07.444454 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:08 crc kubenswrapper[4954]: I1206 06:59:08.111855 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/1.log" Dec 06 06:59:08 crc kubenswrapper[4954]: I1206 06:59:08.442553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:08 crc kubenswrapper[4954]: E1206 06:59:08.442714 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:08 crc kubenswrapper[4954]: I1206 06:59:08.443389 4954 scope.go:117] "RemoveContainer" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" Dec 06 06:59:08 crc kubenswrapper[4954]: E1206 06:59:08.443607 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-crz6w_openshift-ovn-kubernetes(7cc429b1-3932-4515-a2b8-f0dd601f3e4c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" Dec 06 06:59:09 crc kubenswrapper[4954]: I1206 06:59:09.442785 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:09 crc kubenswrapper[4954]: I1206 06:59:09.442856 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:09 crc kubenswrapper[4954]: I1206 06:59:09.442863 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:09 crc kubenswrapper[4954]: E1206 06:59:09.442944 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:09 crc kubenswrapper[4954]: E1206 06:59:09.443039 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:09 crc kubenswrapper[4954]: E1206 06:59:09.443157 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:10 crc kubenswrapper[4954]: I1206 06:59:10.443009 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:10 crc kubenswrapper[4954]: E1206 06:59:10.443200 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:11 crc kubenswrapper[4954]: I1206 06:59:11.442475 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:11 crc kubenswrapper[4954]: E1206 06:59:11.442482 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 06:59:11 crc kubenswrapper[4954]: E1206 06:59:11.442661 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:11 crc kubenswrapper[4954]: I1206 06:59:11.442774 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:11 crc kubenswrapper[4954]: I1206 06:59:11.442941 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:11 crc kubenswrapper[4954]: E1206 06:59:11.442952 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:11 crc kubenswrapper[4954]: E1206 06:59:11.443009 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:12 crc kubenswrapper[4954]: I1206 06:59:12.443199 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:12 crc kubenswrapper[4954]: E1206 06:59:12.443401 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:13 crc kubenswrapper[4954]: I1206 06:59:13.443107 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:13 crc kubenswrapper[4954]: E1206 06:59:13.443288 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:13 crc kubenswrapper[4954]: I1206 06:59:13.443139 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:13 crc kubenswrapper[4954]: I1206 06:59:13.443730 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:13 crc kubenswrapper[4954]: E1206 06:59:13.443815 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:13 crc kubenswrapper[4954]: E1206 06:59:13.443796 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:14 crc kubenswrapper[4954]: I1206 06:59:14.443042 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:14 crc kubenswrapper[4954]: E1206 06:59:14.443226 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:15 crc kubenswrapper[4954]: I1206 06:59:15.442623 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:15 crc kubenswrapper[4954]: I1206 06:59:15.442623 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:15 crc kubenswrapper[4954]: I1206 06:59:15.442641 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:15 crc kubenswrapper[4954]: E1206 06:59:15.444614 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:15 crc kubenswrapper[4954]: E1206 06:59:15.444833 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:15 crc kubenswrapper[4954]: E1206 06:59:15.444932 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:16 crc kubenswrapper[4954]: I1206 06:59:16.442282 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:16 crc kubenswrapper[4954]: E1206 06:59:16.442474 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:16 crc kubenswrapper[4954]: E1206 06:59:16.443308 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 06:59:17 crc kubenswrapper[4954]: I1206 06:59:17.442436 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:17 crc kubenswrapper[4954]: I1206 06:59:17.442489 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:17 crc kubenswrapper[4954]: I1206 06:59:17.442436 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:17 crc kubenswrapper[4954]: E1206 06:59:17.442649 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:17 crc kubenswrapper[4954]: E1206 06:59:17.442897 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:17 crc kubenswrapper[4954]: E1206 06:59:17.442939 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:18 crc kubenswrapper[4954]: I1206 06:59:18.442740 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:18 crc kubenswrapper[4954]: E1206 06:59:18.443035 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:18 crc kubenswrapper[4954]: I1206 06:59:18.443187 4954 scope.go:117] "RemoveContainer" containerID="2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363" Dec 06 06:59:19 crc kubenswrapper[4954]: I1206 06:59:19.443007 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:19 crc kubenswrapper[4954]: E1206 06:59:19.443178 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:19 crc kubenswrapper[4954]: I1206 06:59:19.443230 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:19 crc kubenswrapper[4954]: I1206 06:59:19.443285 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:19 crc kubenswrapper[4954]: E1206 06:59:19.443653 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:19 crc kubenswrapper[4954]: E1206 06:59:19.443750 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:19 crc kubenswrapper[4954]: I1206 06:59:19.444058 4954 scope.go:117] "RemoveContainer" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" Dec 06 06:59:20 crc kubenswrapper[4954]: I1206 06:59:20.155352 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/1.log" Dec 06 06:59:20 crc kubenswrapper[4954]: I1206 06:59:20.155788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerStarted","Data":"927b002a44fb819e93b6e6a1f9d1406a9e6322216198277c321d2d21450973f5"} Dec 06 06:59:20 crc kubenswrapper[4954]: I1206 06:59:20.442985 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:20 crc kubenswrapper[4954]: E1206 06:59:20.443138 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.162612 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/3.log" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.166042 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerStarted","Data":"0c9ec58f7b3b34bd066406e31183b463197f7ff4457112cb1e17a7f40e20eadb"} Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.166735 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.218097 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podStartSLOduration=108.218078351 podStartE2EDuration="1m48.218078351s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:21.217145006 +0000 UTC m=+136.030504395" watchObservedRunningTime="2025-12-06 06:59:21.218078351 +0000 UTC m=+136.031437740" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.247123 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vtxfz"] Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.247306 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:21 crc kubenswrapper[4954]: E1206 06:59:21.247433 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.442977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:21 crc kubenswrapper[4954]: E1206 06:59:21.443184 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:21 crc kubenswrapper[4954]: I1206 06:59:21.443763 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:21 crc kubenswrapper[4954]: E1206 06:59:21.443854 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:21 crc kubenswrapper[4954]: E1206 06:59:21.444238 4954 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 06:59:22 crc kubenswrapper[4954]: I1206 06:59:22.442462 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:22 crc kubenswrapper[4954]: I1206 06:59:22.442523 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:22 crc kubenswrapper[4954]: E1206 06:59:22.442637 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:22 crc kubenswrapper[4954]: E1206 06:59:22.442758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:23 crc kubenswrapper[4954]: I1206 06:59:23.442814 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:23 crc kubenswrapper[4954]: I1206 06:59:23.442865 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:23 crc kubenswrapper[4954]: E1206 06:59:23.443012 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:23 crc kubenswrapper[4954]: E1206 06:59:23.443172 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:24 crc kubenswrapper[4954]: I1206 06:59:24.443345 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:24 crc kubenswrapper[4954]: I1206 06:59:24.443345 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:24 crc kubenswrapper[4954]: E1206 06:59:24.443608 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:24 crc kubenswrapper[4954]: E1206 06:59:24.443780 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:25 crc kubenswrapper[4954]: I1206 06:59:25.442678 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:25 crc kubenswrapper[4954]: E1206 06:59:25.444088 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 06:59:25 crc kubenswrapper[4954]: I1206 06:59:25.444196 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:25 crc kubenswrapper[4954]: E1206 06:59:25.444463 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 06:59:26 crc kubenswrapper[4954]: I1206 06:59:26.442773 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:26 crc kubenswrapper[4954]: I1206 06:59:26.442808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:26 crc kubenswrapper[4954]: E1206 06:59:26.443087 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 06:59:26 crc kubenswrapper[4954]: E1206 06:59:26.443165 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtxfz" podUID="9377db43-9e5b-41e9-a9bc-f5fe3a81a457" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.442933 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.443007 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.445723 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.445949 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.446271 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 06:59:27 crc kubenswrapper[4954]: I1206 06:59:27.448213 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.442510 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.442510 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.444863 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.444892 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.597247 4954 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.638756 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bfbl"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.639446 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.639884 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.640266 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.658417 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.672513 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.673289 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.673581 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.673724 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.673884 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.674038 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.674298 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.674485 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z6mbx"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.674891 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.674981 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5g5d4"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675025 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675118 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675172 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675224 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675323 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675427 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675455 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.675916 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.676127 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.676370 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.676414 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.677054 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.673877 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.678092 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.680882 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.681817 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.682711 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.683282 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qlrf9"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.683996 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.684349 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.685125 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.685709 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.686883 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-56lst"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.687702 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.688260 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.688410 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7p9tp"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.688911 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.688956 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.689654 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.690955 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.691541 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.692183 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.692843 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.693306 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.694187 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.694629 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.695325 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.696828 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.697068 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.697537 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.700660 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.701460 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.702434 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.703640 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gwzmq"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.704251 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.704612 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.704647 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.704965 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.705306 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.724366 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.724797 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.725280 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.725392 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.726811 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.729149 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.732088 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.734128 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.758964 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c560e73e-7742-4248-8b43-c398e456d967-machine-approver-tls\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760476 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-k5tts"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760511 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-config\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760581 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760618 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ff0de8-076e-4536-a897-5bec6fcc7592-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760691 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa6d1ce-12b4-4f90-8c8a-403035535e60-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760726 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa6d1ce-12b4-4f90-8c8a-403035535e60-config\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760754 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95bkb\" (UniqueName: \"kubernetes.io/projected/a835af21-2fbc-46ef-b961-9d111dc803b1-kube-api-access-95bkb\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760773 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-srv-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760806 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6brdg\" (UniqueName: \"kubernetes.io/projected/c560e73e-7742-4248-8b43-c398e456d967-kube-api-access-6brdg\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-auth-proxy-config\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z72c6\" (UniqueName: \"kubernetes.io/projected/6aa6d1ce-12b4-4f90-8c8a-403035535e60-kube-api-access-z72c6\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760892 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760917 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761174 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: 
I1206 06:59:28.761360 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.760917 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsfz\" (UniqueName: \"kubernetes.io/projected/07ff0de8-076e-4536-a897-5bec6fcc7592-kube-api-access-2hsfz\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761669 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761369 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761869 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761764 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.761820 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.769037 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5sc28"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.769183 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.769557 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.770180 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.770213 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.770418 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.770649 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.770909 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771116 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771276 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771356 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771436 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771662 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771772 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771822 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771849 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.771960 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772167 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772298 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772411 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772513 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772636 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772749 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772834 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.772911 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.773010 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.773654 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.773827 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.777556 4954 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.778036 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.778356 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.778453 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.778343 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.778773 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779043 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779192 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779242 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779341 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779261 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779486 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779520 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779520 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779790 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779816 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779852 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779881 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779906 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779831 4954 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780062 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780103 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780144 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780181 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.779837 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780259 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780294 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780074 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780411 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780514 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.780702 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.782486 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.783888 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.787298 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.788138 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.793834 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.794012 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.794339 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.795268 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpgsz"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.795373 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.795778 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.796351 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.796928 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.797161 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.797345 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.797799 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.797839 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.798014 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.798126 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.798263 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.798304 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.798311 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.812046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.812472 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.817457 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.825250 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.828643 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.833737 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.834363 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.834605 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.837616 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.839537 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.840508 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.840940 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.841839 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.843741 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j2cx5"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.845027 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.845349 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.845693 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.845707 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.846783 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.845722 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.847333 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.847656 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.847744 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.849043 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qlrf9"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.850027 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.851144 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5g5d4"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.852152 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.852484 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z6mbx"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.853712 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bfbl"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.854934 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d7n2g"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.856477 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.856717 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.857348 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.858529 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.862478 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5sc28"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.862833 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.864301 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21436169-5d86-482f-8277-dd88780f2b68-serving-cert\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.864357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.864398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-tmpfs\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.864870 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dcsv\" (UniqueName: \"kubernetes.io/projected/dcb356df-9cbe-46e4-a7c1-584a55224bdf-kube-api-access-4dcsv\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.864964 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865045 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865076 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c560e73e-7742-4248-8b43-c398e456d967-machine-approver-tls\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs7r6\" (UniqueName: \"kubernetes.io/projected/891188e6-3c26-44de-84b2-6585f0d5e7dd-kube-api-access-zs7r6\") pod \"downloads-7954f5f757-56lst\" (UID: \"891188e6-3c26-44de-84b2-6585f0d5e7dd\") " pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865226 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit-dir\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865245 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e3a12-5fd4-4796-8eab-70392f7cf809-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865345 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.865368 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-srv-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866656 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866715 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866794 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjnt\" (UniqueName: \"kubernetes.io/projected/5e9d36ab-4a09-4275-9732-cd9bd681a917-kube-api-access-wkjnt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjkm\" (UniqueName: \"kubernetes.io/projected/f9ff9691-e91d-48ba-9ee9-de7807534c6e-kube-api-access-8kjkm\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866915 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkr5c\" (UniqueName: \"kubernetes.io/projected/2b823812-b773-4c33-9e75-55395275621d-kube-api-access-mkr5c\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.866987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4e3a12-5fd4-4796-8eab-70392f7cf809-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-client\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867091 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkxkt\" (UniqueName: \"kubernetes.io/projected/b5678dfe-78f2-40b2-8673-8ae5c96bc282-kube-api-access-dkxkt\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867119 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867148 
4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-trusted-ca\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867225 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867310 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867364 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867577 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jctq\" (UniqueName: \"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-kube-api-access-6jctq\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867794 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867932 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv85r\" (UniqueName: \"kubernetes.io/projected/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-kube-api-access-hv85r\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867987 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.867996 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868071 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-config\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868126 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-serving-cert\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzlrd\" (UniqueName: \"kubernetes.io/projected/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-kube-api-access-wzlrd\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 
06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868209 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-node-pullsecrets\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868234 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-service-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868258 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42266b35-bab7-4577-8054-b55fa5c2dcc0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868334 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tsc\" (UniqueName: \"kubernetes.io/projected/7d4e3a12-5fd4-4796-8eab-70392f7cf809-kube-api-access-72tsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-csi-data-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868396 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868427 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b823812-b773-4c33-9e75-55395275621d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868468 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc225d9-d805-4962-b7aa-71632d5cba7b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-dir\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95bkb\" (UniqueName: \"kubernetes.io/projected/a835af21-2fbc-46ef-b961-9d111dc803b1-kube-api-access-95bkb\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868553 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-encryption-config\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868619 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-plugins-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868650 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.868678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-registration-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.869150 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z72c6\" (UniqueName: \"kubernetes.io/projected/6aa6d1ce-12b4-4f90-8c8a-403035535e60-kube-api-access-z72c6\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.877239 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.877309 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpgsz"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.877337 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-auth-proxy-config\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880213 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880293 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-config\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880428 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.880645 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-client\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881504 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-auth-proxy-config\") pod 
\"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881653 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9078eab8-cd16-404e-a8e6-e02c60ddfe16-serving-cert\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881742 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc225d9-d805-4962-b7aa-71632d5cba7b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881778 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f2c25db-28aa-4b19-8f9a-21b78e02089f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881814 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlctx\" (UniqueName: \"kubernetes.io/projected/76a9970f-0017-4289-b4d5-804bbf7b0e9d-kube-api-access-qlctx\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.881967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42266b35-bab7-4577-8054-b55fa5c2dcc0-config\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc225d9-d805-4962-b7aa-71632d5cba7b-config\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882134 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-image-import-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882128 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882188 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjb52\" (UniqueName: \"kubernetes.io/projected/21436169-5d86-482f-8277-dd88780f2b68-kube-api-access-tjb52\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882295 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpnb\" (UniqueName: \"kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882334 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56l7z\" (UniqueName: \"kubernetes.io/projected/2f2c25db-28aa-4b19-8f9a-21b78e02089f-kube-api-access-56l7z\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882366 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vvk2\" (UniqueName: \"kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " 
pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882439 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882485 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-serving-cert\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-mountpoint-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882652 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882695 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882776 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-client\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882838 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-config\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882872 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9x5w\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-kube-api-access-r9x5w\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882897 
4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76a9970f-0017-4289-b4d5-804bbf7b0e9d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdpnr\" (UniqueName: \"kubernetes.io/projected/dc8c8cf2-372a-4606-b6ab-0781d4326602-kube-api-access-qdpnr\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882941 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdsm4\" (UniqueName: \"kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-encryption-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.882991 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883012 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883085 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 
06:59:28.883228 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ff0de8-076e-4536-a897-5bec6fcc7592-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883275 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa6d1ce-12b4-4f90-8c8a-403035535e60-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883334 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6hk\" (UniqueName: \"kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883358 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883378 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883399 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhg8m\" (UniqueName: \"kubernetes.io/projected/f4fb523f-9022-4352-993a-14844194617e-kube-api-access-jhg8m\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.883864 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa6d1ce-12b4-4f90-8c8a-403035535e60-config\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884209 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884242 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884295 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-policies\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884314 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-config\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884332 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884377 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884411 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884414 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c560e73e-7742-4248-8b43-c398e456d967-config\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884459 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-srv-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884492 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42266b35-bab7-4577-8054-b55fa5c2dcc0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884739 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884790 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gszb\" (UniqueName: \"kubernetes.io/projected/9078eab8-cd16-404e-a8e6-e02c60ddfe16-kube-api-access-2gszb\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884843 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6brdg\" (UniqueName: 
\"kubernetes.io/projected/c560e73e-7742-4248-8b43-c398e456d967-kube-api-access-6brdg\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.884997 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885128 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-config\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-socket-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885208 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885273 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885314 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-images\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-serving-cert\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885366 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885412 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885476 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vgrr\" (UniqueName: \"kubernetes.io/projected/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-kube-api-access-4vgrr\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsfz\" (UniqueName: \"kubernetes.io/projected/07ff0de8-076e-4536-a897-5bec6fcc7592-kube-api-access-2hsfz\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a9970f-0017-4289-b4d5-804bbf7b0e9d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885681 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa6d1ce-12b4-4f90-8c8a-403035535e60-config\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.885709 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.886584 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c560e73e-7742-4248-8b43-c398e456d967-machine-approver-tls\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.888614 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa6d1ce-12b4-4f90-8c8a-403035535e60-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.889242 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.889550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ff0de8-076e-4536-a897-5bec6fcc7592-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.890747 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.892148 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7p9tp"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.893725 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.895196 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.896678 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-56lst"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.898164 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.900158 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.902161 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.902586 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.903446 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.904807 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.906644 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.907769 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.909176 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4w27r"] Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.910378 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.910608 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2h9sc"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.911459 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2h9sc"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.912030 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.913447 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.914813 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.916332 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gwzmq"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.917768 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.919838 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d7n2g"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.922051 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j2cx5"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.922435 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.923874 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4w27r"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.925514 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.927056 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.928635 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2h9sc"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.930050 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pbwcv"]
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.931106 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pbwcv"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.941754 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.962300 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.982285 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.986917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987504 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c76139a-69bb-4812-bd9a-481f6702904b-config\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987541 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987589 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987615 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jctq\" (UniqueName: \"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-kube-api-access-6jctq\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"
Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987643 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv85r\" (UniqueName: \"kubernetes.io/projected/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-kube-api-access-hv85r\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv"
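Each volume above walks the same three-step lifecycle: "operationExecutor.VerifyControllerAttachedVolume started" (reconciler_common.go:245, the desired-state check), then "operationExecutor.MountVolume started" (reconciler_common.go:218), then "MountVolume.SetUp succeeded" (operation_generator.go:637). A volume that logs "started" but never "succeeded" is the usual culprit behind a pod stuck in ContainerCreating. A minimal pairing sketch in Go, keyed on the volume UniqueName plus the trailing pod="ns/name" field; both regexes are assumptions fitted to this excerpt, not a stable kubelet interface:

package main

import (
    "bufio"
    "fmt"
    "os"
    "regexp"
)

// Pair each "operationExecutor.MountVolume started" record with its
// "MountVolume.SetUp succeeded" counterpart and print volumes that never
// completed. The \" escapes match the literal backslash-quote sequences
// in the klog text seen above.
var (
    startRe = regexp.MustCompile(`operationExecutor\.MountVolume started for volume .*?UniqueName: \\"([^"\\]+)\\".*? pod="([^"]+)"`)
    doneRe  = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume .*?UniqueName: \\"([^"\\]+)\\".*? pod="([^"]+)"`)
)

func main() {
    pending := map[string]bool{}
    sc := bufio.NewScanner(os.Stdin)
    sc.Buffer(make([]byte, 1024*1024), 1024*1024) // long journal lines
    for sc.Scan() {
        line := sc.Text()
        if m := startRe.FindStringSubmatch(line); m != nil {
            pending[m[2]+" "+m[1]] = true
        } else if m := doneRe.FindStringSubmatch(line); m != nil {
            delete(pending, m[2]+" "+m[1])
        }
    }
    for key := range pending {
        fmt.Println("mount started but no SetUp succeeded:", key)
    }
}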
" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987674 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-config\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-serving-cert\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987809 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smzft\" (UniqueName: \"kubernetes.io/projected/6003dd30-05c4-4565-82c7-d082a2d36d93-kube-api-access-smzft\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987837 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-node-pullsecrets\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzlrd\" (UniqueName: \"kubernetes.io/projected/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-kube-api-access-wzlrd\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987878 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42266b35-bab7-4577-8054-b55fa5c2dcc0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987898 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-service-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987914 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987939 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tsc\" (UniqueName: \"kubernetes.io/projected/7d4e3a12-5fd4-4796-8eab-70392f7cf809-kube-api-access-72tsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987958 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-csi-data-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc225d9-d805-4962-b7aa-71632d5cba7b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.987992 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-dir\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988007 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-node-pullsecrets\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988045 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988128 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b823812-b773-4c33-9e75-55395275621d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988161 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-encryption-config\") pod 
\"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988183 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-registration-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988401 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-plugins-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6003dd30-05c4-4565-82c7-d082a2d36d93-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988501 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988606 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988630 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-config\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 
06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988650 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-client\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988690 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6003dd30-05c4-4565-82c7-d082a2d36d93-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988711 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9078eab8-cd16-404e-a8e6-e02c60ddfe16-serving-cert\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l226\" (UniqueName: \"kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc225d9-d805-4962-b7aa-71632d5cba7b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f2c25db-28aa-4b19-8f9a-21b78e02089f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42266b35-bab7-4577-8054-b55fa5c2dcc0-config\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988891 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qlctx\" (UniqueName: \"kubernetes.io/projected/76a9970f-0017-4289-b4d5-804bbf7b0e9d-kube-api-access-qlctx\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988932 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.988963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989031 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-registration-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989055 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989104 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989128 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989141 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-dir\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989194 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-plugins-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 
06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc225d9-d805-4962-b7aa-71632d5cba7b-config\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989264 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989306 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjb52\" (UniqueName: \"kubernetes.io/projected/21436169-5d86-482f-8277-dd88780f2b68-kube-api-access-tjb52\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-image-import-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989431 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56l7z\" (UniqueName: \"kubernetes.io/projected/2f2c25db-28aa-4b19-8f9a-21b78e02089f-kube-api-access-56l7z\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.990796 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-config\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.990160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.990605 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-config\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.991036 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config\") pod 
\"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.991106 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpnb\" (UniqueName: \"kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989333 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-csi-data-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.989813 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.991181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vvk2\" (UniqueName: \"kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992267 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc225d9-d805-4962-b7aa-71632d5cba7b-config\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-serving-cert\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992456 
4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992493 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992609 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-mountpoint-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992812 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2p5\" (UniqueName: \"kubernetes.io/projected/5a1c6a2c-6c84-4a85-9fd2-3c589165934a-kube-api-access-6w2p5\") pod \"migrator-59844c95c7-6gq5q\" (UID: \"5a1c6a2c-6c84-4a85-9fd2-3c589165934a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992891 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-client\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992960 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9x5w\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-kube-api-access-r9x5w\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.992999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76a9970f-0017-4289-b4d5-804bbf7b0e9d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdpnr\" (UniqueName: 
\"kubernetes.io/projected/dc8c8cf2-372a-4606-b6ab-0781d4326602-kube-api-access-qdpnr\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993053 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdsm4\" (UniqueName: \"kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993082 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-encryption-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-encryption-config\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-image-import-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.993910 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abc225d9-d805-4962-b7aa-71632d5cba7b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.994067 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.994096 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.994121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc 
kubenswrapper[4954]: I1206 06:59:28.994147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995602 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-mountpoint-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995631 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995877 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995913 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6hk\" (UniqueName: \"kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995946 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.995969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhg8m\" (UniqueName: \"kubernetes.io/projected/f4fb523f-9022-4352-993a-14844194617e-kube-api-access-jhg8m\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.996157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.996160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9078eab8-cd16-404e-a8e6-e02c60ddfe16-serving-cert\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.996398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.996441 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.996458 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-serving-cert\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.997674 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 
06:59:28.997832 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.997828 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.997947 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-policies\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.997998 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-config\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998035 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-etcd-client\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998082 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/76a9970f-0017-4289-b4d5-804bbf7b0e9d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998791 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998797 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-audit-policies\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998852 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.998964 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.999027 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.999079 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.999347 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42266b35-bab7-4577-8054-b55fa5c2dcc0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.999624 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:28 crc kubenswrapper[4954]: I1206 06:59:28.999874 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gszb\" (UniqueName: \"kubernetes.io/projected/9078eab8-cd16-404e-a8e6-e02c60ddfe16-kube-api-access-2gszb\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:28.999988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.001213 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.001273 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.001389 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.001818 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002037 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f2c25db-28aa-4b19-8f9a-21b78e02089f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002145 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-serving-cert\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002790 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-config\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002866 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-serving-cert\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002916 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-socket-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.002989 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003029 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-images\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003071 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a9970f-0017-4289-b4d5-804bbf7b0e9d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003098 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vgrr\" (UniqueName: \"kubernetes.io/projected/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-kube-api-access-4vgrr\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003153 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21436169-5d86-482f-8277-dd88780f2b68-serving-cert\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9ff9691-e91d-48ba-9ee9-de7807534c6e-socket-dir\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003180 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dcsv\" (UniqueName: \"kubernetes.io/projected/dcb356df-9cbe-46e4-a7c1-584a55224bdf-kube-api-access-4dcsv\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-tmpfs\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003295 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003339 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit-dir\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003605 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003640 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5qw\" (UniqueName: \"kubernetes.io/projected/5c76139a-69bb-4812-bd9a-481f6702904b-kube-api-access-qq5qw\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003668 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs7r6\" (UniqueName: \"kubernetes.io/projected/891188e6-3c26-44de-84b2-6585f0d5e7dd-kube-api-access-zs7r6\") pod \"downloads-7954f5f757-56lst\" (UID: \"891188e6-3c26-44de-84b2-6585f0d5e7dd\") " pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003742 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e3a12-5fd4-4796-8eab-70392f7cf809-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003779 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003821 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5678dfe-78f2-40b2-8673-8ae5c96bc282-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003849 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-srv-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003884 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.003948 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4e3a12-5fd4-4796-8eab-70392f7cf809-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004003 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkjnt\" (UniqueName: \"kubernetes.io/projected/5e9d36ab-4a09-4275-9732-cd9bd681a917-kube-api-access-wkjnt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004107 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjkm\" (UniqueName: \"kubernetes.io/projected/f9ff9691-e91d-48ba-9ee9-de7807534c6e-kube-api-access-8kjkm\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkr5c\" (UniqueName: \"kubernetes.io/projected/2b823812-b773-4c33-9e75-55395275621d-kube-api-access-mkr5c\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004172 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-client\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004196 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-config\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkxkt\" (UniqueName: \"kubernetes.io/projected/b5678dfe-78f2-40b2-8673-8ae5c96bc282-kube-api-access-dkxkt\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004239 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004253 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-trusted-ca\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004315 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004350 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004450 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.004922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21436169-5d86-482f-8277-dd88780f2b68-service-ca-bundle\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.005239 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.005374 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.005842 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.005941 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5678dfe-78f2-40b2-8673-8ae5c96bc282-audit-dir\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.006323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-tmpfs\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.006423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2f2c25db-28aa-4b19-8f9a-21b78e02089f-images\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.006653 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5678dfe-78f2-40b2-8673-8ae5c96bc282-encryption-config\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.006899 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.007089 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.007628 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.008198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9078eab8-cd16-404e-a8e6-e02c60ddfe16-trusted-ca\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.008859 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.008907 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21436169-5d86-482f-8277-dd88780f2b68-serving-cert\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.009058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.012234 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.012255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42266b35-bab7-4577-8054-b55fa5c2dcc0-config\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.012463 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a9970f-0017-4289-b4d5-804bbf7b0e9d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.013923 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcb356df-9cbe-46e4-a7c1-584a55224bdf-etcd-client\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.014141 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.014510 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.014525 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.014987 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.022802 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.033414 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42266b35-bab7-4577-8054-b55fa5c2dcc0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.043074 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.063028 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.081991 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.090509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e3a12-5fd4-4796-8eab-70392f7cf809-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.101997 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5qw\" (UniqueName: \"kubernetes.io/projected/5c76139a-69bb-4812-bd9a-481f6702904b-kube-api-access-qq5qw\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105663 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c76139a-69bb-4812-bd9a-481f6702904b-config\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 
06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105774 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smzft\" (UniqueName: \"kubernetes.io/projected/6003dd30-05c4-4565-82c7-d082a2d36d93-kube-api-access-smzft\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105858 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6003dd30-05c4-4565-82c7-d082a2d36d93-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105889 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6003dd30-05c4-4565-82c7-d082a2d36d93-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105912 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l226\" (UniqueName: \"kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.105991 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.106036 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2p5\" (UniqueName: \"kubernetes.io/projected/5a1c6a2c-6c84-4a85-9fd2-3c589165934a-kube-api-access-6w2p5\") pod \"migrator-59844c95c7-6gq5q\" (UID: \"5a1c6a2c-6c84-4a85-9fd2-3c589165934a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.107069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6003dd30-05c4-4565-82c7-d082a2d36d93-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.122988 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.126775 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4e3a12-5fd4-4796-8eab-70392f7cf809-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.141633 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.149389 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-client\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.162075 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.182235 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.201864 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.206250 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8c8cf2-372a-4606-b6ab-0781d4326602-serving-cert\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.221143 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.229355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-config\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.241963 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.246413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.261777 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.269476 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8c8cf2-372a-4606-b6ab-0781d4326602-etcd-service-ca\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.282617 4954 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.301972 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.322626 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.328948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-srv-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.341933 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.346153 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.351472 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a835af21-2fbc-46ef-b961-9d111dc803b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.362153 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.381892 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.401981 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.409462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-srv-cert\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.421908 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.441211 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.462727 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 06:59:29 crc 
kubenswrapper[4954]: I1206 06:59:29.468171 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.481759 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.489600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.501656 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.521373 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.542407 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.562510 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.581269 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.601967 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.621043 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.641190 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.661673 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.682031 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.702221 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.722524 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.741235 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.761793 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.782661 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.789209 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6003dd30-05c4-4565-82c7-d082a2d36d93-proxy-tls\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.800024 4954 request.go:700] Waited for 1.002199359s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0 Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.802119 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.822046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.841894 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.861855 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.870399 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.889064 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.897137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.903944 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.921933 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.961692 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.973091 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b823812-b773-4c33-9e75-55395275621d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:29 crc kubenswrapper[4954]: I1206 06:59:29.982771 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.990423 4954 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.990528 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls podName:5e9d36ab-4a09-4275-9732-cd9bd681a917 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.490501399 +0000 UTC m=+145.303860788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-wq6gx" (UID: "5e9d36ab-4a09-4275-9732-cd9bd681a917") : failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.997519 4954 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.997591 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert podName:e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.497576123 +0000 UTC m=+145.310935512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert") pod "packageserver-d55dfcdfc-4x2sh" (UID: "e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67") : failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.998615 4954 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:29 crc kubenswrapper[4954]: E1206 06:59:29.998661 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert podName:e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67 nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.498649401 +0000 UTC m=+145.312008790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert") pod "packageserver-d55dfcdfc-4x2sh" (UID: "e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67") : failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.001274 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.004831 4954 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.004926 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle podName:f4fb523f-9022-4352-993a-14844194617e nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.504898634 +0000 UTC m=+145.318258033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle") pod "service-ca-9c57cc56f-j2cx5" (UID: "f4fb523f-9022-4352-993a-14844194617e") : failed to sync configmap cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.005955 4954 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.006037 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key podName:f4fb523f-9022-4352-993a-14844194617e nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.506026924 +0000 UTC m=+145.319386313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key") pod "service-ca-9c57cc56f-j2cx5" (UID: "f4fb523f-9022-4352-993a-14844194617e") : failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.022647 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.042274 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.063126 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.083052 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.102419 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.106656 4954 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: E1206 06:59:30.106759 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert podName:5c76139a-69bb-4812-bd9a-481f6702904b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:30.606736151 +0000 UTC m=+145.420095540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert") pod "service-ca-operator-777779d784-bbfcd" (UID: "5c76139a-69bb-4812-bd9a-481f6702904b") : failed to sync secret cache: timed out waiting for the condition Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.107721 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c76139a-69bb-4812-bd9a-481f6702904b-config\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.122309 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.141616 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.163110 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.182516 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.202910 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.222223 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.242239 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.262053 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.281820 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.302613 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.323031 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.342953 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.362624 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.383130 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.402360 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.422458 4954 
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.422458 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.441778 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.462844 4954 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.520108 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/392a92d5-3d71-4d5e-9bf3-d1231e3b76da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-27pnw\" (UID: \"392a92d5-3d71-4d5e-9bf3-d1231e3b76da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.535910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.535970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.536037 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.536084 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.536232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.537859 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f4fb523f-9022-4352-993a-14844194617e-signing-cabundle\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.538177 4954 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-95bkb\" (UniqueName: \"kubernetes.io/projected/a835af21-2fbc-46ef-b961-9d111dc803b1-kube-api-access-95bkb\") pod \"catalog-operator-68c6474976-4t69k\" (UID: \"a835af21-2fbc-46ef-b961-9d111dc803b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.539551 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e9d36ab-4a09-4275-9732-cd9bd681a917-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.539968 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-webhook-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.540471 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-apiservice-cert\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.542145 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f4fb523f-9022-4352-993a-14844194617e-signing-key\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.558491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z72c6\" (UniqueName: \"kubernetes.io/projected/6aa6d1ce-12b4-4f90-8c8a-403035535e60-kube-api-access-z72c6\") pod \"openshift-apiserver-operator-796bbdcf4f-428jw\" (UID: \"6aa6d1ce-12b4-4f90-8c8a-403035535e60\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.576623 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6brdg\" (UniqueName: \"kubernetes.io/projected/c560e73e-7742-4248-8b43-c398e456d967-kube-api-access-6brdg\") pod \"machine-approver-56656f9798-7dd7m\" (UID: \"c560e73e-7742-4248-8b43-c398e456d967\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.598893 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsfz\" (UniqueName: \"kubernetes.io/projected/07ff0de8-076e-4536-a897-5bec6fcc7592-kube-api-access-2hsfz\") pod \"cluster-samples-operator-665b6dd947-ljfkb\" (UID: \"07ff0de8-076e-4536-a897-5bec6fcc7592\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.601840 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.611722 4954 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.621933 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 06:59:30 crc kubenswrapper[4954]: W1206 06:59:30.629411 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc560e73e_7742_4248_8b43_c398e456d967.slice/crio-d4b136ac185265981d940ed37b771d5497457a83bab9e8badad64dae9b81daa0 WatchSource:0}: Error finding container d4b136ac185265981d940ed37b771d5497457a83bab9e8badad64dae9b81daa0: Status 404 returned error can't find the container with id d4b136ac185265981d940ed37b771d5497457a83bab9e8badad64dae9b81daa0 Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.637956 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.642612 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.644536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c76139a-69bb-4812-bd9a-481f6702904b-serving-cert\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.663023 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.682683 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.701969 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.722432 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.727682 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.742360 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.762764 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.771364 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.771364 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.782738 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.791738 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.795928 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.819842 4954 request.go:700] Waited for 1.831846039s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.825123 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jctq\" (UniqueName: \"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-kube-api-access-6jctq\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.840178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv85r\" (UniqueName: \"kubernetes.io/projected/aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7-kube-api-access-hv85r\") pod \"olm-operator-6b444d44fb-w26lv\" (UID: \"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.859345 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzlrd\" (UniqueName: \"kubernetes.io/projected/e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67-kube-api-access-wzlrd\") pod \"packageserver-d55dfcdfc-4x2sh\" (UID: \"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.881520 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42266b35-bab7-4577-8054-b55fa5c2dcc0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g89rd\" (UID: \"42266b35-bab7-4577-8054-b55fa5c2dcc0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd"
Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.904178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tsc\" (UniqueName: \"kubernetes.io/projected/7d4e3a12-5fd4-4796-8eab-70392f7cf809-kube-api-access-72tsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-cnfn5\" (UID: \"7d4e3a12-5fd4-4796-8eab-70392f7cf809\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5"
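
The request.go:700 line above comes from client-go's client-side rate limiter, and as it says, this is distinct from server-side API Priority and Fairness: the burst of service-account token POSTs during mass pod startup exceeds the client's QPS/Burst budget, so requests queue locally and the wait is logged. A sketch of where that budget lives, with illustrative values (the kubelet's real settings differ; the kubeconfig path and namespace are assumptions):

```go
// Minimal sketch of the client-side limit behind "Waited ... due to
// client-side throttling": every client-go client carries a token-bucket
// limiter configured by rest.Config's QPS and Burst fields.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	config, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	config.QPS = 50    // steady-state requests per second (illustrative)
	config.Burst = 100 // short bursts above QPS before waits kick in

	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	// Each request consumes a limiter token; once the bucket empties,
	// client-go sleeps and logs the throttling message seen in the journal.
	pods, err := clientset.CoreV1().Pods("openshift-service-ca").List(
		context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("pods:", len(pods.Items))
}
```
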
\"kube-apiserver-operator-766d6c64bb-pm69p\" (UID: \"abc225d9-d805-4962-b7aa-71632d5cba7b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.951035 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlctx\" (UniqueName: \"kubernetes.io/projected/76a9970f-0017-4289-b4d5-804bbf7b0e9d-kube-api-access-qlctx\") pod \"openshift-config-operator-7777fb866f-6wbv2\" (UID: \"76a9970f-0017-4289-b4d5-804bbf7b0e9d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.960521 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.963678 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56l7z\" (UniqueName: \"kubernetes.io/projected/2f2c25db-28aa-4b19-8f9a-21b78e02089f-kube-api-access-56l7z\") pod \"machine-api-operator-5694c8668f-z6mbx\" (UID: \"2f2c25db-28aa-4b19-8f9a-21b78e02089f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.970159 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:30 crc kubenswrapper[4954]: I1206 06:59:30.978281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjb52\" (UniqueName: \"kubernetes.io/projected/21436169-5d86-482f-8277-dd88780f2b68-kube-api-access-tjb52\") pod \"authentication-operator-69f744f599-qlrf9\" (UID: \"21436169-5d86-482f-8277-dd88780f2b68\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.006082 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpnb\" (UniqueName: \"kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb\") pod \"oauth-openshift-558db77b4-5g5d4\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") " pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.010491 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.017076 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vvk2\" (UniqueName: \"kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2\") pod \"console-f9d7485db-z4t2h\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") " pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.041413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdsm4\" (UniqueName: \"kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4\") pod \"route-controller-manager-6576b87f9c-pvtl6\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.046207 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.057169 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.087008 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.088261 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.113213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhg8m\" (UniqueName: \"kubernetes.io/projected/f4fb523f-9022-4352-993a-14844194617e-kube-api-access-jhg8m\") pod \"service-ca-9c57cc56f-j2cx5\" (UID: \"f4fb523f-9022-4352-993a-14844194617e\") " pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.114760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9x5w\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-kube-api-access-r9x5w\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.121540 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.141628 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ba7b20e-fa15-4fcb-8755-9594b2084aa0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8hvbg\" (UID: \"0ba7b20e-fa15-4fcb-8755-9594b2084aa0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.161403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gszb\" (UniqueName: \"kubernetes.io/projected/9078eab8-cd16-404e-a8e6-e02c60ddfe16-kube-api-access-2gszb\") pod \"console-operator-58897d9998-7p9tp\" (UID: \"9078eab8-cd16-404e-a8e6-e02c60ddfe16\") " pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.161732 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.185073 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54ca3bc-3cac-4f64-a27b-6e0f899f16b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bt8q4\" (UID: \"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.188756 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.197508 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" event={"ID":"c560e73e-7742-4248-8b43-c398e456d967","Type":"ContainerStarted","Data":"d4b136ac185265981d940ed37b771d5497457a83bab9e8badad64dae9b81daa0"} Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.200075 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dcsv\" (UniqueName: \"kubernetes.io/projected/dcb356df-9cbe-46e4-a7c1-584a55224bdf-kube-api-access-4dcsv\") pod \"apiserver-7bbb656c7d-c665p\" (UID: \"dcb356df-9cbe-46e4-a7c1-584a55224bdf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.232253 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.253912 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.256303 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjkm\" (UniqueName: \"kubernetes.io/projected/f9ff9691-e91d-48ba-9ee9-de7807534c6e-kube-api-access-8kjkm\") pod \"csi-hostpathplugin-d7n2g\" (UID: \"f9ff9691-e91d-48ba-9ee9-de7807534c6e\") " pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.258300 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.278245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkr5c\" (UniqueName: \"kubernetes.io/projected/2b823812-b773-4c33-9e75-55395275621d-kube-api-access-mkr5c\") pod \"package-server-manager-789f6589d5-7p752\" (UID: \"2b823812-b773-4c33-9e75-55395275621d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.286203 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.292019 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.298221 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkxkt\" (UniqueName: \"kubernetes.io/projected/b5678dfe-78f2-40b2-8673-8ae5c96bc282-kube-api-access-dkxkt\") pod \"apiserver-76f77b778f-6bfbl\" (UID: \"b5678dfe-78f2-40b2-8673-8ae5c96bc282\") " pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.319048 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs7r6\" (UniqueName: \"kubernetes.io/projected/891188e6-3c26-44de-84b2-6585f0d5e7dd-kube-api-access-zs7r6\") pod \"downloads-7954f5f757-56lst\" (UID: \"891188e6-3c26-44de-84b2-6585f0d5e7dd\") " pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.319151 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.337888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5qw\" (UniqueName: \"kubernetes.io/projected/5c76139a-69bb-4812-bd9a-481f6702904b-kube-api-access-qq5qw\") pod \"service-ca-operator-777779d784-bbfcd\" (UID: \"5c76139a-69bb-4812-bd9a-481f6702904b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.358386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smzft\" (UniqueName: \"kubernetes.io/projected/6003dd30-05c4-4565-82c7-d082a2d36d93-kube-api-access-smzft\") pod \"machine-config-controller-84d6567774-gk4bb\" (UID: \"6003dd30-05c4-4565-82c7-d082a2d36d93\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.377474 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l226\" (UniqueName: \"kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226\") pod \"marketplace-operator-79b997595-29x4r\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.380056 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:31 crc kubenswrapper[4954]: I1206 06:59:31.396955 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2p5\" (UniqueName: \"kubernetes.io/projected/5a1c6a2c-6c84-4a85-9fd2-3c589165934a-kube-api-access-6w2p5\") pod \"migrator-59844c95c7-6gq5q\" (UID: \"5a1c6a2c-6c84-4a85-9fd2-3c589165934a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.105722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdpnr\" (UniqueName: \"kubernetes.io/projected/dc8c8cf2-372a-4606-b6ab-0781d4326602-kube-api-access-qdpnr\") pod \"etcd-operator-b45778765-gwzmq\" (UID: \"dc8c8cf2-372a-4606-b6ab-0781d4326602\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.105725 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6hk\" (UniqueName: \"kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk\") pod \"controller-manager-879f6c89f-jzdxj\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.109916 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.110786 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.110877 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.112462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vgrr\" (UniqueName: \"kubernetes.io/projected/4ffb6c45-5d12-405d-8b73-9a54b4d0922f-kube-api-access-4vgrr\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccjjn\" (UID: \"4ffb6c45-5d12-405d-8b73-9a54b4d0922f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.112084 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113141 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.112941 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.112812 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113219 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113353 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113413 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113796 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.113803 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.114965 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.115431 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.115443 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjnt\" (UniqueName: \"kubernetes.io/projected/5e9d36ab-4a09-4275-9732-cd9bd681a917-kube-api-access-wkjnt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq6gx\" (UID: \"5e9d36ab-4a09-4275-9732-cd9bd681a917\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.115628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.116536 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.116723 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plrp\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.119153 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.61911286 +0000 UTC m=+147.432472289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.122592 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.191742 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.197881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.202028 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.204205 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.207499 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.212423 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.214399 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.216930 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.217720 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.217971 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dbb2866-4465-4a76-b007-f381ba65f51e-proxy-tls\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.218082 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.7180447 +0000 UTC m=+147.531404149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218222 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218266 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plrp\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218291 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-default-certificate\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218327 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: 
\"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218399 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-metrics-certs\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218465 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed8e0159-4aa0-4e24-982d-b8e43b561192-service-ca-bundle\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218522 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218586 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-stats-auth\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218677 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4603aefa-389d-4f23-b247-1c7e98f9dc94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218708 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218749 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218775 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218815 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gfx9\" (UniqueName: \"kubernetes.io/projected/2c9c8d86-e93d-431d-8734-331a2f3997bf-kube-api-access-7gfx9\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218840 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697rr\" (UniqueName: \"kubernetes.io/projected/4603aefa-389d-4f23-b247-1c7e98f9dc94-kube-api-access-697rr\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218868 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxxc\" (UniqueName: \"kubernetes.io/projected/ed8e0159-4aa0-4e24-982d-b8e43b561192-kube-api-access-xsxxc\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.218883 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.718866282 +0000 UTC m=+147.532225771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218917 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rcp\" (UniqueName: \"kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218951 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-images\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.218993 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxbq\" (UniqueName: \"kubernetes.io/projected/9dbb2866-4465-4a76-b007-f381ba65f51e-kube-api-access-jlxbq\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.219471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c9c8d86-e93d-431d-8734-331a2f3997bf-metrics-tls\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.219279 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.221536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.222194 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.223377 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.224345 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.235927 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.258878 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7plrp\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.281710 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321254 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktk6\" (UniqueName: \"kubernetes.io/projected/8be1c797-26e0-4752-ab88-e593ce382532-kube-api-access-nktk6\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxbq\" (UniqueName: \"kubernetes.io/projected/9dbb2866-4465-4a76-b007-f381ba65f51e-kube-api-access-jlxbq\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321510 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zqf\" (UniqueName: \"kubernetes.io/projected/f04de935-d344-44d6-9cc5-0abd0ec89d50-kube-api-access-68zqf\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321547 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c9c8d86-e93d-431d-8734-331a2f3997bf-metrics-tls\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.321681 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.821640872 +0000 UTC m=+147.635000261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dbb2866-4465-4a76-b007-f381ba65f51e-proxy-tls\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321858 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-node-bootstrap-token\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.321967 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-default-certificate\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322102 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322294 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8be1c797-26e0-4752-ab88-e593ce382532-metrics-tls\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.322352 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.82232563 +0000 UTC m=+147.635685219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-metrics-certs\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322579 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-certs\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed8e0159-4aa0-4e24-982d-b8e43b561192-service-ca-bundle\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322686 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c00fbc-6a46-4e89-af8e-202e23796264-cert\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcb5\" (UniqueName: \"kubernetes.io/projected/f2c00fbc-6a46-4e89-af8e-202e23796264-kube-api-access-jfcb5\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.322930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-stats-auth\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.323144 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4603aefa-389d-4f23-b247-1c7e98f9dc94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.323742 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed8e0159-4aa0-4e24-982d-b8e43b561192-service-ca-bundle\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.324494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.327383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.328039 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.328955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329065 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8be1c797-26e0-4752-ab88-e593ce382532-config-volume\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697rr\" (UniqueName: \"kubernetes.io/projected/4603aefa-389d-4f23-b247-1c7e98f9dc94-kube-api-access-697rr\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-default-certificate\") pod \"router-default-5444994796-k5tts\" (UID: 
\"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gfx9\" (UniqueName: \"kubernetes.io/projected/2c9c8d86-e93d-431d-8734-331a2f3997bf-kube-api-access-7gfx9\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329758 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxxc\" (UniqueName: \"kubernetes.io/projected/ed8e0159-4aa0-4e24-982d-b8e43b561192-kube-api-access-xsxxc\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rcp\" (UniqueName: \"kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.329890 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-images\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.332364 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9dbb2866-4465-4a76-b007-f381ba65f51e-images\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.333165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dbb2866-4465-4a76-b007-f381ba65f51e-proxy-tls\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.335524 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c9c8d86-e93d-431d-8734-331a2f3997bf-metrics-tls\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.336409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-metrics-certs\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.337085 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.339493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed8e0159-4aa0-4e24-982d-b8e43b561192-stats-auth\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.346919 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4603aefa-389d-4f23-b247-1c7e98f9dc94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.364352 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxbq\" (UniqueName: \"kubernetes.io/projected/9dbb2866-4465-4a76-b007-f381ba65f51e-kube-api-access-jlxbq\") pod \"machine-config-operator-74547568cd-fwst7\" (UID: \"9dbb2866-4465-4a76-b007-f381ba65f51e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.377103 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.402142 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.407815 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697rr\" (UniqueName: \"kubernetes.io/projected/4603aefa-389d-4f23-b247-1c7e98f9dc94-kube-api-access-697rr\") pod \"multus-admission-controller-857f4d67dd-zpgsz\" (UID: \"4603aefa-389d-4f23-b247-1c7e98f9dc94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.430831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8be1c797-26e0-4752-ab88-e593ce382532-config-volume\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431064 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktk6\" (UniqueName: \"kubernetes.io/projected/8be1c797-26e0-4752-ab88-e593ce382532-kube-api-access-nktk6\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc 
kubenswrapper[4954]: I1206 06:59:32.431093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68zqf\" (UniqueName: \"kubernetes.io/projected/f04de935-d344-44d6-9cc5-0abd0ec89d50-kube-api-access-68zqf\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-node-bootstrap-token\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8be1c797-26e0-4752-ab88-e593ce382532-metrics-tls\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431222 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-certs\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c00fbc-6a46-4e89-af8e-202e23796264-cert\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.431272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcb5\" (UniqueName: \"kubernetes.io/projected/f2c00fbc-6a46-4e89-af8e-202e23796264-kube-api-access-jfcb5\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.431752 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:32.931705513 +0000 UTC m=+147.745065082 (durationBeforeRetry 500ms). 
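Every failed volume operation in this stretch bottoms out in the same message: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers. The kubelet resolves CSI calls against drivers that have registered node-locally, and at this point in the boot the hostpath-provisioner plugin pod has not come up yet (its SyncLoop UPDATE for hostpath-provisioner/csi-hostpathplugin-d7n2g only appears at 06:59:33.891 below). One way to see what a node has actually registered is its CSINode object, which the kubelet maintains as plugins register. A minimal client-go sketch, assuming a kubeconfig at the default path; the node name crc is taken from the log's hostname field, and this inspects the API-visible view rather than the kubelet's in-memory registry. (The Error: detail of the retry entry interrupted here continues below.)

```go
// List the CSI drivers registered on node "crc" via its CSINode object.
// A driver absent from this list is consistent with the kubelet error
// "driver name ... not found in the list of registered CSI drivers".
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	csiNode, err := client.StorageV1().CSINodes().Get(context.Background(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		// Expect kubevirt.io.hostpath-provisioner here once the plugin pod registers.
		fmt.Println("registered:", d.Name)
	}
}
```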
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.432335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rcp\" (UniqueName: \"kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp\") pod \"collect-profiles-29416725-9mhn4\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.435902 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8be1c797-26e0-4752-ab88-e593ce382532-config-volume\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.436355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-node-bootstrap-token\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.440685 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f04de935-d344-44d6-9cc5-0abd0ec89d50-certs\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.445138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8be1c797-26e0-4752-ab88-e593ce382532-metrics-tls\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.445524 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.446600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2c00fbc-6a46-4e89-af8e-202e23796264-cert\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.449036 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsxxc\" (UniqueName: \"kubernetes.io/projected/ed8e0159-4aa0-4e24-982d-b8e43b561192-kube-api-access-xsxxc\") pod \"router-default-5444994796-k5tts\" (UID: \"ed8e0159-4aa0-4e24-982d-b8e43b561192\") " pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.463304 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gfx9\" (UniqueName: \"kubernetes.io/projected/2c9c8d86-e93d-431d-8734-331a2f3997bf-kube-api-access-7gfx9\") pod \"dns-operator-744455d44c-5sc28\" (UID: \"2c9c8d86-e93d-431d-8734-331a2f3997bf\") " pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.488432 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.505208 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcb5\" (UniqueName: \"kubernetes.io/projected/f2c00fbc-6a46-4e89-af8e-202e23796264-kube-api-access-jfcb5\") pod \"ingress-canary-2h9sc\" (UID: \"f2c00fbc-6a46-4e89-af8e-202e23796264\") " pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.508188 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2h9sc" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.527937 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zqf\" (UniqueName: \"kubernetes.io/projected/f04de935-d344-44d6-9cc5-0abd0ec89d50-kube-api-access-68zqf\") pod \"machine-config-server-pbwcv\" (UID: \"f04de935-d344-44d6-9cc5-0abd0ec89d50\") " pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.535103 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.535511 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.03549448 +0000 UTC m=+147.848853869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.545097 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktk6\" (UniqueName: \"kubernetes.io/projected/8be1c797-26e0-4752-ab88-e593ce382532-kube-api-access-nktk6\") pod \"dns-default-4w27r\" (UID: \"8be1c797-26e0-4752-ab88-e593ce382532\") " pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.657962 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.658216 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.15817342 +0000 UTC m=+147.971532809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.662458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.663216 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.663611 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.16352624 +0000 UTC m=+147.976885619 (durationBeforeRetry 500ms). 
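The cadence of these entries comes from nestedpendingoperations: a failed volume operation is parked and may not run again until durationBeforeRetry elapses, pinned here at 500ms, which is why the identical failures recur roughly twice a second with the m=+… monotonic timestamps marching in half-second steps. Below is a fixed-delay retry loop in that spirit; it is an illustrative sketch only, not the kubelet's actual code, which also tracks per-volume exponential backoff. (The interrupted entry's Error: detail resumes after the sketch.)

```go
// Illustrative fixed-delay retry in the spirit of the kubelet's
// nestedpendingoperations: after a failure, the operation may not run
// again until durationBeforeRetry (here 500ms) has elapsed.
package main

import (
	"errors"
	"fmt"
	"sync/atomic"
	"time"
)

// retryUntil runs op every delay until it succeeds or deadline passes.
func retryUntil(op func() error, delay time.Duration, deadline time.Time) error {
	for {
		err := op()
		if err == nil {
			return nil
		}
		next := time.Now().Add(delay)
		if next.After(deadline) {
			return fmt.Errorf("giving up: %w", err)
		}
		fmt.Printf("failed. No retries permitted until %s (durationBeforeRetry %v)\n",
			next.Format(time.RFC3339Nano), delay)
		time.Sleep(delay)
	}
}

func main() {
	var registered atomic.Bool
	// Stand-in for the hostpath-provisioner plugin registering a bit later.
	go func() { time.Sleep(2 * time.Second); registered.Store(true) }()

	err := retryUntil(func() error {
		if !registered.Load() {
			return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
		}
		return nil
	}, 500*time.Millisecond, time.Now().Add(10*time.Second))
	fmt.Println("final:", err) // final: <nil> once registration lands
}
```

A fixed delay keeps the log regular and bounded; the trade-off against exponential backoff is more retry traffic in exchange for a faster recovery once the missing driver appears.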
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.666956 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.700063 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.762003 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.764044 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.765281 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.265251953 +0000 UTC m=+148.078611342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.765802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.766214 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.266142227 +0000 UTC m=+148.079501616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.768370 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv"] Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.802500 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.814195 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pbwcv" Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.866684 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.867299 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.367275613 +0000 UTC m=+148.180635002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:32 crc kubenswrapper[4954]: W1206 06:59:32.963179 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa19d8cf_e25d_4ff3_9a82_4921d5a1e9c7.slice/crio-4f96ee56a5682d326e53956c412f8c2d700e99f8d29fa54d42f6386db9c660f1 WatchSource:0}: Error finding container 4f96ee56a5682d326e53956c412f8c2d700e99f8d29fa54d42f6386db9c660f1: Status 404 returned error can't find the container with id 4f96ee56a5682d326e53956c412f8c2d700e99f8d29fa54d42f6386db9c660f1 Dec 06 06:59:32 crc kubenswrapper[4954]: I1206 06:59:32.968270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:32 crc kubenswrapper[4954]: E1206 06:59:32.968876 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
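Note that the same PVC appears in both directions of reconciliation: UnmountVolume.TearDown for the terminating pod UID 8f668bae-612b-4b75-9490-919e737c6a3b and MountVolume.MountDevice for the incoming image-registry-697d97f7c8-sc4fg. Both paths must build a node-local CSI client for the same driver, so both fail identically until registration happens over the kubelet's plugin-registration socket. On the node itself, registration sockets conventionally appear under /var/lib/kubelet/plugins_registry; that path is the upstream kubelet default and an assumption on my part, not something this log states. A quick directory check as a sketch (the interrupted entry resumes below):

```go
// List kubelet plugin-registration sockets. An empty directory here is
// consistent with "not found in the list of registered CSI drivers".
// The path is the upstream default and may differ per distribution.
package main

import (
	"fmt"
	"os"
)

func main() {
	entries, err := os.ReadDir("/var/lib/kubelet/plugins_registry")
	if err != nil {
		fmt.Println("cannot read registry dir:", err)
		return
	}
	for _, e := range entries {
		fmt.Println("registration socket:", e.Name())
	}
}
```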
No retries permitted until 2025-12-06 06:59:33.468857893 +0000 UTC m=+148.282217282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.069298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.069510 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.569470697 +0000 UTC m=+148.382830086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.070323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.070788 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.570767701 +0000 UTC m=+148.384127090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.171423 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.171892 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.671869018 +0000 UTC m=+148.485228397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.273190 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.273638 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.773623182 +0000 UTC m=+148.586982571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.274704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k5tts" event={"ID":"ed8e0159-4aa0-4e24-982d-b8e43b561192","Type":"ContainerStarted","Data":"b133f23ea4da28cdca2f9aa5214b2e1cbb1bff86c9a004ab6635d29b83ff0d14"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.277127 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" event={"ID":"dcb356df-9cbe-46e4-a7c1-584a55224bdf","Type":"ContainerStarted","Data":"e635fad779892f17934208ae1417e47a8cf26527d07681260401db859572a44a"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.278825 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" event={"ID":"a835af21-2fbc-46ef-b961-9d111dc803b1","Type":"ContainerStarted","Data":"bfbad45275da4c2ebdc495f5eb709fd34202f82db016f6e8da4aec30eec5a2b8"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.279908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" event={"ID":"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7","Type":"ContainerStarted","Data":"4f96ee56a5682d326e53956c412f8c2d700e99f8d29fa54d42f6386db9c660f1"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.283031 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" event={"ID":"7d4e3a12-5fd4-4796-8eab-70392f7cf809","Type":"ContainerStarted","Data":"38fdc937b70d9b1f1834c6080a85fa5f525745cec28386502bfc97922beeea1b"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.285117 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" event={"ID":"abc225d9-d805-4962-b7aa-71632d5cba7b","Type":"ContainerStarted","Data":"f6c5dc6f386999abab0f39beffed2741caad4fe4ec3bc310a060b0caaa8dcf2c"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.289528 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" event={"ID":"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67","Type":"ContainerStarted","Data":"9e47b8af5a6416bead2c89140a398f2572802d2c4340afbbad333e016a6b2b7b"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.289601 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" event={"ID":"e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67","Type":"ContainerStarted","Data":"e2fe14f01d957463e91dd92b2aed9bae7eb0b565c6fd117ddf9cbedc4670f054"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.290140 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.297124 4954 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-4x2sh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.297295 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" podUID="e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.298814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" event={"ID":"392a92d5-3d71-4d5e-9bf3-d1231e3b76da","Type":"ContainerStarted","Data":"a284bca3dbcc698a4e8cb24054c0e6ee843b77239316369b0d674c7d4c69d1a7"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.302100 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pbwcv" event={"ID":"f04de935-d344-44d6-9cc5-0abd0ec89d50","Type":"ContainerStarted","Data":"7c42157ffe1c3e64278f200e7e2fc6cf6fd5321f5fcb0d699da67274ae6de20e"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.304258 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" event={"ID":"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919","Type":"ContainerStarted","Data":"7ce19d9d3f44a097f43a70bb9a890a106167cbb5f8cc0d5c173e7521b2ff9e50"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.316344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" event={"ID":"07ff0de8-076e-4536-a897-5bec6fcc7592","Type":"ContainerStarted","Data":"696355fa62cac35412b5e792d8d92b7e90709e6cbe2bc55f18c061f8fba2c3a5"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.322122 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" podStartSLOduration=120.322100157 podStartE2EDuration="2m0.322100157s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:33.3161009 +0000 UTC m=+148.129460299" watchObservedRunningTime="2025-12-06 06:59:33.322100157 +0000 UTC m=+148.135459546" Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.323315 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" event={"ID":"c560e73e-7742-4248-8b43-c398e456d967","Type":"ContainerStarted","Data":"e2e31e2a0610ff488a178e17c8c1867dae40cd9db95932114e535867f2e9c692"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.329968 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" event={"ID":"42266b35-bab7-4577-8054-b55fa5c2dcc0","Type":"ContainerStarted","Data":"57b256bada686c98e6db6934e6c520192485442c2565fb2f8f271f3994ae66cc"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.331353 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" event={"ID":"76a9970f-0017-4289-b4d5-804bbf7b0e9d","Type":"ContainerStarted","Data":"d0927cbbcffb8fec31b827dbf961bf5d0bc6cf6a0d7b63bc858088e467f35b88"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.333615 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" event={"ID":"6aa6d1ce-12b4-4f90-8c8a-403035535e60","Type":"ContainerStarted","Data":"b2b777eba4829367ada0662807c2fdc3bbb9b2b146eebd178f0a2fbe7e8511d0"} Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.354524 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" podStartSLOduration=120.354505062 podStartE2EDuration="2m0.354505062s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:33.352176661 +0000 UTC m=+148.165536060" watchObservedRunningTime="2025-12-06 06:59:33.354505062 +0000 UTC m=+148.167864451" Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.375602 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.375892 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.875857739 +0000 UTC m=+148.689217128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.376333 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.376682 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.87666881 +0000 UTC m=+148.690028199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.477128 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.477666 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:33.977648764 +0000 UTC m=+148.791008143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.601175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.601722 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.101697269 +0000 UTC m=+148.915056658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.702901 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.703168 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.203127315 +0000 UTC m=+149.016486704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.703496 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.703923 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.203907065 +0000 UTC m=+149.017266454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.806397 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.806842 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.306805639 +0000 UTC m=+149.120165028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.807188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.807661 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.307650011 +0000 UTC m=+149.121009480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.881836 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bfbl"] Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.891710 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d7n2g"] Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.901086 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx"] Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.908773 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.908966 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.408947513 +0000 UTC m=+149.222306902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.909041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:33 crc kubenswrapper[4954]: E1206 06:59:33.909396 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.409389545 +0000 UTC m=+149.222748934 (durationBeforeRetry 500ms). 
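The SyncLoop UPDATE entries are the kubelet's api source delivering pod updates from the apiserver; the notable one here is hostpath-provisioner/csi-hostpathplugin-d7n2g at 06:59:33.891, the very plugin whose registration should clear the CSI failures above. An outside observer can watch the same update stream with a client-go informer; a sketch, assuming a default kubeconfig and filtering on the hostpath-provisioner namespace. (The interrupted retry entry's Error: detail resumes below.)

```go
// Watch pod updates the way any API client is notified, narrowed to the
// hostpath-provisioner namespace to mirror the SyncLoop UPDATE entries.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 30*time.Second, informers.WithNamespace("hostpath-provisioner"))
	informer := factory.Core().V1().Pods().Informer()
	informer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		UpdateFunc: func(oldObj, newObj interface{}) {
			pod := newObj.(*corev1.Pod)
			fmt.Printf("UPDATE %s/%s phase=%s\n", pod.Namespace, pod.Name, pod.Status.Phase)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	factory.WaitForCacheSync(stop)
	select {} // run until killed
}
```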
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:33 crc kubenswrapper[4954]: W1206 06:59:33.923801 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5678dfe_78f2_40b2_8673_8ae5c96bc282.slice/crio-0d9a1b6c0114e729b09dad0fc6bdba54884cb283b4c06d76400ebaabbc7db75e WatchSource:0}: Error finding container 0d9a1b6c0114e729b09dad0fc6bdba54884cb283b4c06d76400ebaabbc7db75e: Status 404 returned error can't find the container with id 0d9a1b6c0114e729b09dad0fc6bdba54884cb283b4c06d76400ebaabbc7db75e Dec 06 06:59:33 crc kubenswrapper[4954]: I1206 06:59:33.928761 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5g5d4"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.012417 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.012914 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.512871444 +0000 UTC m=+149.326230823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.013054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.013547 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.513538592 +0000 UTC m=+149.326897971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.071451 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.091350 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j2cx5"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.115112 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.116005 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.615987954 +0000 UTC m=+149.429347343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.121654 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.130244 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.135851 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gwzmq"] Dec 06 06:59:34 crc kubenswrapper[4954]: W1206 06:59:34.162070 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7733adb_d709_4fee_bfc6_728721369b82.slice/crio-02b8a177e8cbe0122f0c1c162d47c1e78341dd182ce0d247fb5e5acb3b2e9f7e WatchSource:0}: Error finding container 02b8a177e8cbe0122f0c1c162d47c1e78341dd182ce0d247fb5e5acb3b2e9f7e: Status 404 returned error can't find the container with id 02b8a177e8cbe0122f0c1c162d47c1e78341dd182ce0d247fb5e5acb3b2e9f7e Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.200181 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-56lst"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.200245 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z6mbx"] Dec 06 06:59:34 crc kubenswrapper[4954]: 
I1206 06:59:34.206257 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qlrf9"] Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.217628 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.218051 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.718035905 +0000 UTC m=+149.531395294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: W1206 06:59:34.306145 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891188e6_3c26_44de_84b2_6585f0d5e7dd.slice/crio-0ba499d3b0809733d81b0d72ae91c7374ffc9f65a3f2a74752be2630e9d7d1b1 WatchSource:0}: Error finding container 0ba499d3b0809733d81b0d72ae91c7374ffc9f65a3f2a74752be2630e9d7d1b1: Status 404 returned error can't find the container with id 0ba499d3b0809733d81b0d72ae91c7374ffc9f65a3f2a74752be2630e9d7d1b1 Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.320260 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.320981 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.82094941 +0000 UTC m=+149.634308819 (durationBeforeRetry 500ms). 
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.342136 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.350795 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4w27r"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.356771 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.364280 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.365956 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.374062 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2h9sc"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.377931 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" event={"ID":"abc225d9-d805-4962-b7aa-71632d5cba7b","Type":"ContainerStarted","Data":"a5298443d177b767ac792243a562828a944a05092c01a53d45316fc28fcb6ce1"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.403764 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.406989 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" event={"ID":"dc8c8cf2-372a-4606-b6ab-0781d4326602","Type":"ContainerStarted","Data":"cffc3b6ffa7b94b7cca847c21ded80a7aa07032b486b82e08c02ec2aadb31f51"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.413512 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z4t2h" event={"ID":"b7733adb-d709-4fee-bfc6-728721369b82","Type":"ContainerStarted","Data":"02b8a177e8cbe0122f0c1c162d47c1e78341dd182ce0d247fb5e5acb3b2e9f7e"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.416786 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.424979 4954 generic.go:334] "Generic (PLEG): container finished" podID="dcb356df-9cbe-46e4-a7c1-584a55224bdf" containerID="d0940efe790209a8cc8168ca52915a0b98aa5758154cce9b2e3e81f73062bddc" exitCode=0
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.426157 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5sc28"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.426188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" event={"ID":"dcb356df-9cbe-46e4-a7c1-584a55224bdf","Type":"ContainerDied","Data":"d0940efe790209a8cc8168ca52915a0b98aa5758154cce9b2e3e81f73062bddc"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.426786 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.427109 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:34.927098599 +0000 UTC m=+149.740457988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.432799 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pm69p" podStartSLOduration=121.432775097 podStartE2EDuration="2m1.432775097s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.409451928 +0000 UTC m=+149.222811317" watchObservedRunningTime="2025-12-06 06:59:34.432775097 +0000 UTC m=+149.246134486"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.438343 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" event={"ID":"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919","Type":"ContainerStarted","Data":"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.439643 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.446110 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" event={"ID":"07ff0de8-076e-4536-a897-5bec6fcc7592","Type":"ContainerStarted","Data":"14b68f0ff9a411d704e7bec41d52fe76813bed79344f2a31cc70c7992643d772"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.446158 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" event={"ID":"07ff0de8-076e-4536-a897-5bec6fcc7592","Type":"ContainerStarted","Data":"dc40e8dfdb04cec8a361ad480293b3cd5496d9376d3e31e2788ede6ad849f2ec"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.484177 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7p9tp"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.499084 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpgsz"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.499211 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" event={"ID":"2f2c25db-28aa-4b19-8f9a-21b78e02089f","Type":"ContainerStarted","Data":"9fbec2c99b04705c62fb7942b0a53ad71f3d04338af092075807fec369231995"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.507105 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pbwcv" event={"ID":"f04de935-d344-44d6-9cc5-0abd0ec89d50","Type":"ContainerStarted","Data":"616af0481c00164d0e9e15327b20edcb4aca79972b70634521502478588c6d31"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.510113 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.529352 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.530708 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.03067931 +0000 UTC m=+149.844038699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.541750 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.545735 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4"]
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.548096 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-428jw" event={"ID":"6aa6d1ce-12b4-4f90-8c8a-403035535e60","Type":"ContainerStarted","Data":"24a7104a924c200c834afaf1c811ed8684d1ec82cecf9bf5945d02d3854b09d6"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.546598 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" podStartSLOduration=120.546578155 podStartE2EDuration="2m0.546578155s" podCreationTimestamp="2025-12-06 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.517413204 +0000 UTC m=+149.330772593" watchObservedRunningTime="2025-12-06 06:59:34.546578155 +0000 UTC m=+149.359937564"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.556809 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" event={"ID":"5e9d36ab-4a09-4275-9732-cd9bd681a917","Type":"ContainerStarted","Data":"25b7ce5a81f4e5811d475e56e2c54b27afdb5ff463e420d9f0bb62ccd03f3ef6"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.570058 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ljfkb" podStartSLOduration=121.570034017 podStartE2EDuration="2m1.570034017s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.563041694 +0000 UTC m=+149.376401083" watchObservedRunningTime="2025-12-06 06:59:34.570034017 +0000 UTC m=+149.383393406"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.591913 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pbwcv" podStartSLOduration=6.591894527 podStartE2EDuration="6.591894527s" podCreationTimestamp="2025-12-06 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.590530271 +0000 UTC m=+149.403889660" watchObservedRunningTime="2025-12-06 06:59:34.591894527 +0000 UTC m=+149.405253916"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.618173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" event={"ID":"a835af21-2fbc-46ef-b961-9d111dc803b1","Type":"ContainerStarted","Data":"9358d202ff85bd8ee95222f1960070249f889b96e8a80ea7c6d5688af31da816"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.619287 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.631674 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.631780 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k"
Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.632750 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.132714962 +0000 UTC m=+149.946074551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.649508 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4t69k" podStartSLOduration=121.649482539 podStartE2EDuration="2m1.649482539s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.649318315 +0000 UTC m=+149.462677714" watchObservedRunningTime="2025-12-06 06:59:34.649482539 +0000 UTC m=+149.462841958"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.652616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" event={"ID":"c560e73e-7742-4248-8b43-c398e456d967","Type":"ContainerStarted","Data":"6d02a8dd07d4720d7a1b80812a1b5c992f2d59fcec9f775b2bd9c5cd87e5ed37"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.671903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" event={"ID":"2b823812-b773-4c33-9e75-55395275621d","Type":"ContainerStarted","Data":"28679d2b11abcea0541ddcc28f5a19972b30d1a5ce7620dc1f82c49e33bee346"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.681836 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" event={"ID":"f9ff9691-e91d-48ba-9ee9-de7807534c6e","Type":"ContainerStarted","Data":"4cb9f74321f44588ec1a4983ecf3abe4692ba28b5d1331c6c7daa77c839ef479"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.691420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" event={"ID":"b5678dfe-78f2-40b2-8673-8ae5c96bc282","Type":"ContainerStarted","Data":"0d9a1b6c0114e729b09dad0fc6bdba54884cb283b4c06d76400ebaabbc7db75e"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.703363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k5tts" event={"ID":"ed8e0159-4aa0-4e24-982d-b8e43b561192","Type":"ContainerStarted","Data":"53915cdc3d8c7f75ae7ad52910bdfdce7c0164980a19e1f98ea46ffa2d65d1e3"}
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.713282 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7dd7m" podStartSLOduration=121.713267153 podStartE2EDuration="2m1.713267153s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.712156054 +0000 UTC m=+149.525515463" watchObservedRunningTime="2025-12-06 06:59:34.713267153 +0000 UTC m=+149.526626542"
Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.717907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" event={"ID":"4c71517f-f5be-4508-8ce4-df43bd8700b7","Type":"ContainerStarted","Data":"2378b48570e9865f3430dbf17c91ffe52a56edb05cd9c7f5db33323a009ced8d"}
event={"ID":"4c71517f-f5be-4508-8ce4-df43bd8700b7","Type":"ContainerStarted","Data":"2378b48570e9865f3430dbf17c91ffe52a56edb05cd9c7f5db33323a009ced8d"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.719144 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.738336 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.739882 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.239844386 +0000 UTC m=+150.053203995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.740504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" event={"ID":"42266b35-bab7-4577-8054-b55fa5c2dcc0","Type":"ContainerStarted","Data":"f667d75c4563fdd1b3d46d310c83ae676e7130e6ac2ef8b018f72c9c2632b96f"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.757580 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" event={"ID":"aa19d8cf-e25d-4ff3-9a82-4921d5a1e9c7","Type":"ContainerStarted","Data":"97f81041d5362701e7ae7fcc8f26021252219d27e2df651385382724ed536e63"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.758002 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.762822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-56lst" event={"ID":"891188e6-3c26-44de-84b2-6585f0d5e7dd","Type":"ContainerStarted","Data":"0ba499d3b0809733d81b0d72ae91c7374ffc9f65a3f2a74752be2630e9d7d1b1"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.768093 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" event={"ID":"21436169-5d86-482f-8277-dd88780f2b68","Type":"ContainerStarted","Data":"940859519dc44c8fbe7b77a2ff2572ffa3a743c5bc2171a3efca716f9530ccb2"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.775283 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" 
event={"ID":"392a92d5-3d71-4d5e-9bf3-d1231e3b76da","Type":"ContainerStarted","Data":"13e571c2f3a901c597b52b5820d2b488502daf577f41738aa3fe6f4c5a9a6066"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.780527 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-k5tts" podStartSLOduration=121.780494496 podStartE2EDuration="2m1.780494496s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.746411587 +0000 UTC m=+149.559770986" watchObservedRunningTime="2025-12-06 06:59:34.780494496 +0000 UTC m=+149.593853885" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.782886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" event={"ID":"f4fb523f-9022-4352-993a-14844194617e","Type":"ContainerStarted","Data":"27479ba4d197fb7568dca3e3287e521e6abcba377e62d966be7c6ce288544840"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.803508 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.804442 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g89rd" podStartSLOduration=121.804432021 podStartE2EDuration="2m1.804432021s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.803941068 +0000 UTC m=+149.617300457" watchObservedRunningTime="2025-12-06 06:59:34.804432021 +0000 UTC m=+149.617791410" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.838635 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w26lv" podStartSLOduration=121.838612602 podStartE2EDuration="2m1.838612602s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.833073538 +0000 UTC m=+149.646432927" watchObservedRunningTime="2025-12-06 06:59:34.838612602 +0000 UTC m=+149.651971991" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.840388 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.841938 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.341924228 +0000 UTC m=+150.155283617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.844747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" event={"ID":"0ba7b20e-fa15-4fcb-8755-9594b2084aa0","Type":"ContainerStarted","Data":"123a2efa007335360d18ee664fe9a8851d1e9f572177948aced6d178e75d5d98"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.876557 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-27pnw" podStartSLOduration=121.876532341 podStartE2EDuration="2m1.876532341s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.855532723 +0000 UTC m=+149.668892122" watchObservedRunningTime="2025-12-06 06:59:34.876532341 +0000 UTC m=+149.689891730" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.882966 4954 generic.go:334] "Generic (PLEG): container finished" podID="76a9970f-0017-4289-b4d5-804bbf7b0e9d" containerID="c45db04b5ff3db41ac0c75bb5f1b994f901bbfa4b10b37e6cb13926ef30182e4" exitCode=0 Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.883060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" event={"ID":"76a9970f-0017-4289-b4d5-804bbf7b0e9d","Type":"ContainerDied","Data":"c45db04b5ff3db41ac0c75bb5f1b994f901bbfa4b10b37e6cb13926ef30182e4"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.915606 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" event={"ID":"7d4e3a12-5fd4-4796-8eab-70392f7cf809","Type":"ContainerStarted","Data":"f8d59bed5e09fbd72d35fd7cffc2ea53a3a9bd5665789b63bb465f42340e14f2"} Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.951944 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:34 crc kubenswrapper[4954]: E1206 06:59:34.953856 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.453834307 +0000 UTC m=+150.267193696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.973436 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" Dec 06 06:59:34 crc kubenswrapper[4954]: I1206 06:59:34.981512 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cnfn5" podStartSLOduration=121.981493939 podStartE2EDuration="2m1.981493939s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:34.975488322 +0000 UTC m=+149.788847751" watchObservedRunningTime="2025-12-06 06:59:34.981493939 +0000 UTC m=+149.794853328" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.056302 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.073445 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.573412066 +0000 UTC m=+150.386771455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.157904 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.158408 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.658385453 +0000 UTC m=+150.471744842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.183825 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7qkf"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.184923 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.195283 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7qkf"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.207003 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.262082 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.262576 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.262599 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sld9h\" (UniqueName: \"kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.262650 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.263003 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.762988721 +0000 UTC m=+150.576348110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.382396 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.390960 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.391039 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.391887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.392093 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.892064458 +0000 UTC m=+150.705423847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.392165 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.392186 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sld9h\" (UniqueName: \"kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.392251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.392621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.392946 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:35.89291235 +0000 UTC m=+150.706271739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.392993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.403011 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.406118 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.443901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sld9h\" (UniqueName: \"kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h\") pod \"community-operators-r7qkf\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " pod="openshift-marketplace/community-operators-r7qkf" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.494742 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.495061 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.495145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.495175 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58mz\" (UniqueName: \"kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.495282 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 06:59:35.995264339 +0000 UTC m=+150.808623728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.570960 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zbdtt"]
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.572708 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbdtt"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.574747 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbdtt"]
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.600785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.600917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.600980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.601013 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58mz\" (UniqueName: \"kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.601436 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.101413338 +0000 UTC m=+150.914772787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.602058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.602155 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7qkf"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.603481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.636497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58mz\" (UniqueName: \"kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz\") pod \"certified-operators-5wq5n\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " pod="openshift-marketplace/certified-operators-5wq5n"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.668320 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-k5tts"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.675037 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:35 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:35 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:35 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.675136 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.705032 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.705306 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.205291618 +0000 UTC m=+151.018651007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.705354 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.705419 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.705529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rb69\" (UniqueName: \"kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt"
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.705879 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.706213 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.206200651 +0000 UTC m=+151.019560040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.760722 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wq5n"
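The failure repeating through this stretch (attacher.MountDevice and Unmounter.TearDownAt both failing with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers") is the kubelet looking the volume's CSI driver up in its in-memory registry of plugins before every mount or unmount attempt; until the hostpath-provisioner plugin pod (csi-hostpathplugin-d7n2g, whose container start appears further down) registers itself with the kubelet, each attempt fails fast and is requeued. A minimal Go sketch of that lookup pattern, with illustrative types rather than kubelet's actual ones:

package main

import (
	"errors"
	"fmt"
)

// csiDriverRegistry is an illustrative stand-in for the kubelet's list of
// CSI plugins that have announced themselves; maps driver name -> endpoint.
type csiDriverRegistry map[string]string

// newClient fails fast, mirroring the error text in the log, when the
// requested driver has not registered yet.
func (r csiDriverRegistry) newClient(driver string) (string, error) {
	ep, ok := r[driver]
	if !ok {
		return "", errors.New("driver name " + driver + " not found in the list of registered CSI drivers")
	}
	return ep, nil
}

func main() {
	reg := csiDriverRegistry{} // empty: the plugin has not registered yet
	if _, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("MountVolume.MountDevice failed:", err)
	}
}

The condition is self-healing: once the driver pod comes up and registers, the same queued operations succeed on their next retry, which is why these errors are noise during startup but a real problem if they persist.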
Need to start a new one" pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.761884 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.763366 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.778485 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.810298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.810689 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.810728 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.810814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rb69\" (UniqueName: \"kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.813278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.813589 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.313535401 +0000 UTC m=+151.126894790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.815109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.868544 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rb69\" (UniqueName: \"kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69\") pod \"community-operators-zbdtt\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.912011 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.912074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.912124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nrpm\" (UniqueName: \"kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.912166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:35 crc kubenswrapper[4954]: E1206 06:59:35.912387 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.412369649 +0000 UTC m=+151.225729038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.953495 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" event={"ID":"6003dd30-05c4-4565-82c7-d082a2d36d93","Type":"ContainerStarted","Data":"5cf69ae1a9df01c04ad78f40c105dc9619eb47e1c06262315f2a9fc434bf8d7d"} Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.956738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z4t2h" event={"ID":"b7733adb-d709-4fee-bfc6-728721369b82","Type":"ContainerStarted","Data":"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"} Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.981451 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" event={"ID":"4603aefa-389d-4f23-b247-1c7e98f9dc94","Type":"ContainerStarted","Data":"b1eb297a3da624c4536bf48c82cecc228442085a73359cdaf82874225df47b03"} Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.988202 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" event={"ID":"5c76139a-69bb-4812-bd9a-481f6702904b","Type":"ContainerStarted","Data":"6ed7317ab8b4fd91e0303b7c12f3905138bb4a0297da11608309594d188bfccd"} Dec 06 06:59:35 crc kubenswrapper[4954]: I1206 06:59:35.991592 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" event={"ID":"9078eab8-cd16-404e-a8e6-e02c60ddfe16","Type":"ContainerStarted","Data":"a44aa75d3e0fbd0e84cff441c879b4b22fd86543d7cf635190cd10cf5f9d1751"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.014231 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.014500 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.014611 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.014650 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrpm\" (UniqueName: 
\"kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.015523 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.015575 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.015676 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.515652123 +0000 UTC m=+151.329011592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.053116 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" event={"ID":"2b823812-b773-4c33-9e75-55395275621d","Type":"ContainerStarted","Data":"4b59943f099640eea7e77acc843075c5cdc31b720438d0f3c4922ee421011b0f"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.058142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4w27r" event={"ID":"8be1c797-26e0-4752-ab88-e593ce382532","Type":"ContainerStarted","Data":"39f4ee178326dc4ac52734adc4155123d25962dd7c744c02659ad171dc4b4243"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.060585 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" event={"ID":"dc8c8cf2-372a-4606-b6ab-0781d4326602","Type":"ContainerStarted","Data":"a3a00415dda7d8cb1ca787f06141149f93b2f595eca7fef44ea6ad779639c63a"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.065814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" event={"ID":"5a1c6a2c-6c84-4a85-9fd2-3c589165934a","Type":"ContainerStarted","Data":"f734444e1f326e47091441dc24263661ec8e923e6b016d6da81e7203826638be"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.070384 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" event={"ID":"4c71517f-f5be-4508-8ce4-df43bd8700b7","Type":"ContainerStarted","Data":"5af4757e9ea26ef3a1ad1c9124b5cea1df905378f885c22950890d2303be7c41"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 
06:59:36.071425 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.078009 4954 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5g5d4 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.079207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrpm\" (UniqueName: \"kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm\") pod \"certified-operators-vfhrk\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.079292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" event={"ID":"5e9d36ab-4a09-4275-9732-cd9bd681a917","Type":"ContainerStarted","Data":"79df69268d4dfc7535e71806d1ea1465ac509784a261a4e306916cd6d8d28f3c"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.078193 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.092464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" event={"ID":"f4fb523f-9022-4352-993a-14844194617e","Type":"ContainerStarted","Data":"314ad7bb3c190a647dcac33210b47504a2d968290c502c40faa14df4f68d8ca6"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.102421 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" event={"ID":"f9ff9691-e91d-48ba-9ee9-de7807534c6e","Type":"ContainerStarted","Data":"9f9bb7b2d80260b467b56e9ef3e08a2ddd19e7a566c1510f845cb66dff568b90"} Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.105780 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbdtt" Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.116043 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.117347 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.617329045 +0000 UTC m=+151.430688534 (durationBeforeRetry 500ms). 
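Alongside the volume retries, several pods are failing their HTTP probes while their servers come up: oauth-openshift's readiness probe gets connection refused (nothing is listening on 10.217.0.7:6443 yet), and the router's startup probe gets HTTP 500 from its healthz endpoint. What the kubelet does here reduces to a timed GET whose transport errors and error statuses both count as a probe failure. A self-contained sketch of that check (the URL is the endpoint quoted in the log, reachable only from the cluster network; TLS handling is left at Go defaults, unlike the kubelet's probe client):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe issues one GET and reports failure both for transport errors
// (e.g. "connect: connection refused" while the server is still starting)
// and for error status codes (the router's healthz returns 500 above).
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("https://10.217.0.7:6443/healthz"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}

A failed readiness probe only removes the pod from service endpoints; the container keeps running and, as the later "status=\"ready\"" entries show, joins back as soon as a probe succeeds.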
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.142957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" event={"ID":"21436169-5d86-482f-8277-dd88780f2b68","Type":"ContainerStarted","Data":"04617940fae8137531fbd817e6f6d4b969c8a521c82e4808fe980bdccd764389"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.144819 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" event={"ID":"97da928c-a8a1-48ef-90f4-68a650becdf6","Type":"ContainerStarted","Data":"ae991b17adef97791a3877e3ebc1100ca2515671894485f83f54d8d73d0fd2d0"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.145989 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" event={"ID":"2f2c25db-28aa-4b19-8f9a-21b78e02089f","Type":"ContainerStarted","Data":"99d8f66bf01df5bada1a8bee668aaba33b19443e897fcbb5f399a4261c465951"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.146809 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfhrk"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.149653 4954 generic.go:334] "Generic (PLEG): container finished" podID="b5678dfe-78f2-40b2-8673-8ae5c96bc282" containerID="3a7e8c3c8173c0723f461f49f6ed84ee71379290127f1c9a184c8ba2bdf6e9f1" exitCode=0
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.149716 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" event={"ID":"b5678dfe-78f2-40b2-8673-8ae5c96bc282","Type":"ContainerDied","Data":"3a7e8c3c8173c0723f461f49f6ed84ee71379290127f1c9a184c8ba2bdf6e9f1"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.161084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" event={"ID":"9dbb2866-4465-4a76-b007-f381ba65f51e","Type":"ContainerStarted","Data":"d13211d929a88c9a5ed5122a316d7338fbfe28fdd1b1599fa290949a2f177807"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.192227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" event={"ID":"0ba7b20e-fa15-4fcb-8755-9594b2084aa0","Type":"ContainerStarted","Data":"2fd5c466ba216c69bdb78429943e21c712c295efb3e7a8a424e6d9bfe1ea011d"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.202421 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" event={"ID":"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7","Type":"ContainerStarted","Data":"c8e68bc4b5fcf463fb68d740495404dff268bb90ccaaa8399d25a621565e0a77"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.202469 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" event={"ID":"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7","Type":"ContainerStarted","Data":"c3a5b7e311c5e3f17546c5eddfe9a120a804810ff33177119859205455b64315"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.218002 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.222914 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" event={"ID":"c2c7e376-ab13-4dfa-900f-8de633105709","Type":"ContainerStarted","Data":"e766a5d3086914a55fa15a45967f17f99cc7294f3dd71d0a82585ff82461e71e"}
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.229197 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.729173592 +0000 UTC m=+151.542532981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.251918 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2h9sc" event={"ID":"f2c00fbc-6a46-4e89-af8e-202e23796264","Type":"ContainerStarted","Data":"16ad9ddeacd6178ed8f47eb7d8b4109d88b87cc8dea463397d6b7f7d795ddc58"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.356874 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.360282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" event={"ID":"2c9c8d86-e93d-431d-8734-331a2f3997bf","Type":"ContainerStarted","Data":"cc9056315b865ee24ca9e912e5b6153413b46dd4e85c0721766492991fe04e92"}
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.372279 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.872253954 +0000 UTC m=+151.685613343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.390090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" event={"ID":"4ffb6c45-5d12-405d-8b73-9a54b4d0922f","Type":"ContainerStarted","Data":"d0d0fc96aaa6ded257e42ad954e08c10f7d3ab6e48d09ae465411f9a8f842589"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.417750 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" podStartSLOduration=123.41772616 podStartE2EDuration="2m3.41772616s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.393472548 +0000 UTC m=+151.206831947" watchObservedRunningTime="2025-12-06 06:59:36.41772616 +0000 UTC m=+151.231085559"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.418912 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" event={"ID":"11bae354-1c41-442d-820c-d3cd3fa537d8","Type":"ContainerStarted","Data":"b1d40dfa8ee62fdc64b51200d0003552e46f1079f39cfa20dae90dee48f4aa6c"}
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.419933 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.439512 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq6gx" podStartSLOduration=123.439478967 podStartE2EDuration="2m3.439478967s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.425799071 +0000 UTC m=+151.239158460" watchObservedRunningTime="2025-12-06 06:59:36.439478967 +0000 UTC m=+151.252838356"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.447830 4954 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jzdxj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.447911 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.458548 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.459023 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:36.959006136 +0000 UTC m=+151.772365525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.505408 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7qkf"]
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.506715 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-gwzmq" podStartSLOduration=123.50669209 podStartE2EDuration="2m3.50669209s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.49290023 +0000 UTC m=+151.306259649" watchObservedRunningTime="2025-12-06 06:59:36.50669209 +0000 UTC m=+151.320051479"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.537910 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" podStartSLOduration=123.537891724 podStartE2EDuration="2m3.537891724s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.534545976 +0000 UTC m=+151.347905375" watchObservedRunningTime="2025-12-06 06:59:36.537891724 +0000 UTC m=+151.351251113"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.565905 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.566446 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.066429048 +0000 UTC m=+151.879788437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.633815 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-j2cx5" podStartSLOduration=122.633792065 podStartE2EDuration="2m2.633792065s" podCreationTimestamp="2025-12-06 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.590517446 +0000 UTC m=+151.403876835" watchObservedRunningTime="2025-12-06 06:59:36.633792065 +0000 UTC m=+151.447151454"
Dec 06 06:59:36 crc kubenswrapper[4954]: W1206 06:59:36.651074 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fd0cda_b6fe_411e_a815_cd883c0ed24f.slice/crio-f5b33fc35689e7ff1b4c6ba0e64d33bec60ab404f520afcad6919b2b2f041fce WatchSource:0}: Error finding container f5b33fc35689e7ff1b4c6ba0e64d33bec60ab404f520afcad6919b2b2f041fce: Status 404 returned error can't find the container with id f5b33fc35689e7ff1b4c6ba0e64d33bec60ab404f520afcad6919b2b2f041fce
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.692822 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:36 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:36 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:36 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.693424 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.696198 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.696605 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.196582623 +0000 UTC m=+152.009942012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.703492 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8hvbg" podStartSLOduration=123.703469952 podStartE2EDuration="2m3.703469952s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.636105685 +0000 UTC m=+151.449465084" watchObservedRunningTime="2025-12-06 06:59:36.703469952 +0000 UTC m=+151.516829341"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.749851 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-z4t2h" podStartSLOduration=123.749834032 podStartE2EDuration="2m3.749834032s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.696813019 +0000 UTC m=+151.510172438" watchObservedRunningTime="2025-12-06 06:59:36.749834032 +0000 UTC m=+151.563193421"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.818725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.819212 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.319193781 +0000 UTC m=+152.132553170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.844106 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" podStartSLOduration=123.84407997 podStartE2EDuration="2m3.84407997s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:36.748942629 +0000 UTC m=+151.562302018" watchObservedRunningTime="2025-12-06 06:59:36.84407997 +0000 UTC m=+151.657439359"
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.854851 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"]
Dec 06 06:59:36 crc kubenswrapper[4954]: I1206 06:59:36.920704 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:36 crc kubenswrapper[4954]: E1206 06:59:36.921719 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.421696005 +0000 UTC m=+152.235055394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.035798 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.049543 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.549427846 +0000 UTC m=+152.362787235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.145951 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.146370 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.646316613 +0000 UTC m=+152.459676002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.147141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.147972 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.647954626 +0000 UTC m=+152.461314015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.249571 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.250388 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.750368867 +0000 UTC m=+152.563728256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.351856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.352414 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.852393278 +0000 UTC m=+152.665752667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.372694 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"]
Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.381484 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghdt"
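Each failed volume operation above is requeued with "No retries permitted until <timestamp> (durationBeforeRetry 500ms)": the reconciler records a deadline next to the pending operation and skips it on every sync pass until the deadline passes, which is why the same MountDevice/TearDown pair reappears roughly twice per second in this log. A sketch of that gate under those assumptions (the types are illustrative, not the nestedpendingoperations API):

package main

import (
	"fmt"
	"time"
)

// retryGate mirrors the behavior visible in the log: after a failure,
// no retry is permitted until notBefore; the reconciler keeps polling
// and simply skips the operation while the gate is closed.
type retryGate struct {
	delay     time.Duration
	notBefore time.Time
}

// fail records the failure time and pushes the next attempt out by delay.
func (g *retryGate) fail(now time.Time) { g.notBefore = now.Add(g.delay) }

// allowed reports whether a retry may start at the given instant.
func (g *retryGate) allowed(now time.Time) bool { return !now.Before(g.notBefore) }

func main() {
	g := retryGate{delay: 500 * time.Millisecond} // durationBeforeRetry 500ms
	now := time.Now()
	g.fail(now)
	fmt.Println("retry allowed immediately?", g.allowed(now))
	fmt.Println("retry allowed after 600ms?", g.allowed(now.Add(600*time.Millisecond)))
}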
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.387708 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.417888 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbdtt"] Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.437578 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"] Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.453692 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.453827 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.953800974 +0000 UTC m=+152.767160363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.454096 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.454150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.454199 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbhtg\" (UniqueName: \"kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.454255 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 
06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.454537 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:37.954529463 +0000 UTC m=+152.767888852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.538778 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.538838 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.560388 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.560713 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbhtg\" (UniqueName: \"kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.560812 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.560859 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.561022 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.060983159 +0000 UTC m=+152.874342548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.561467 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.561752 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.634346 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-56lst" podStartSLOduration=124.634316772 podStartE2EDuration="2m4.634316772s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:37.610369887 +0000 UTC m=+152.423729276" watchObservedRunningTime="2025-12-06 06:59:37.634316772 +0000 UTC m=+152.447676161" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.661462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbhtg\" (UniqueName: \"kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg\") pod \"redhat-marketplace-dghdt\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.662264 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.680060 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.16262046 +0000 UTC m=+152.975979969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.681621 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2h9sc" podStartSLOduration=9.681589585 podStartE2EDuration="9.681589585s" podCreationTimestamp="2025-12-06 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:37.675408064 +0000 UTC m=+152.488767453" watchObservedRunningTime="2025-12-06 06:59:37.681589585 +0000 UTC m=+152.494948974" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.715231 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:37 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:37 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:37 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.715305 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.764931 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.765390 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.26537252 +0000 UTC m=+153.078731909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.780120 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-56lst" event={"ID":"891188e6-3c26-44de-84b2-6585f0d5e7dd","Type":"ContainerStarted","Data":"f55a9b87c288aa8d8eae9c8377b06c087cd5e4261dbd691fdd863a6711628129"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.780170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerStarted","Data":"f0b5a77da269f6dff239fa0aca6f3fee519ba7203fed1f181f91e1960633a3e0"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.780181 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2h9sc" event={"ID":"f2c00fbc-6a46-4e89-af8e-202e23796264","Type":"ContainerStarted","Data":"4a483750bb1527017ac35b19cef54ab4576c5acec2bda8601d0605b0a7b0cf09"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.780205 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.780218 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" event={"ID":"2c9c8d86-e93d-431d-8734-331a2f3997bf","Type":"ContainerStarted","Data":"5e8ee14035f4a6c01fd4fd28537505167f6209a072b938d96ef1bafd84fa46e1"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.781967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" event={"ID":"11bae354-1c41-442d-820c-d3cd3fa537d8","Type":"ContainerStarted","Data":"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.787633 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.797170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" event={"ID":"5c76139a-69bb-4812-bd9a-481f6702904b","Type":"ContainerStarted","Data":"53b6554e24714966b8a901922b6417d9ebc0f29b68b03468ef663715e1f7310c"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.809154 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.854455 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" event={"ID":"9078eab8-cd16-404e-a8e6-e02c60ddfe16","Type":"ContainerStarted","Data":"49a28dc15164e28ead02f6148fec9c12a1a2288029970443dd429ecd5baa6946"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.854933 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:37 crc 
kubenswrapper[4954]: I1206 06:59:37.873515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.876309 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.376295804 +0000 UTC m=+153.189655193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.884044 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4w27r" event={"ID":"8be1c797-26e0-4752-ab88-e593ce382532","Type":"ContainerStarted","Data":"b02e203c9f030555900ed92c6561e707e2731d34357c146a5831960ce7aa79cd"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.901346 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"] Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.919783 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.933173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" event={"ID":"c2c7e376-ab13-4dfa-900f-8de633105709","Type":"ContainerStarted","Data":"ef93735e42b3a0f6d59ae3c748e4dd01da95673b9109a66563800aa0c64b986c"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.964265 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" event={"ID":"2f2c25db-28aa-4b19-8f9a-21b78e02089f","Type":"ContainerStarted","Data":"23e93c7b64b7d1dc1386a5b854000efc0123724cea55ff8952c7e4cc6f0a5da8"} Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.975177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.983626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:37 crc kubenswrapper[4954]: E1206 06:59:37.986852 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.486803386 +0000 UTC m=+153.300162775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.989398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:37 crc kubenswrapper[4954]: I1206 06:59:37.989658 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglcx\" (UniqueName: \"kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.002910 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" event={"ID":"97da928c-a8a1-48ef-90f4-68a650becdf6","Type":"ContainerStarted","Data":"a9574ad19f6d3ee15996d25ff8e3294a32fe268f15ee300dd1f81198811acc71"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.004259 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.012449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" event={"ID":"4603aefa-389d-4f23-b247-1c7e98f9dc94","Type":"ContainerStarted","Data":"5112c3fa2466c98b9be1ddcf1e7388fb9750172f23d433106693fce29f85086d"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.016595 4954 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-29x4r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.016680 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.115154 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bbfcd" podStartSLOduration=124.115123923 podStartE2EDuration="2m4.115123923s" podCreationTimestamp="2025-12-06 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:37.996600191 +0000 UTC m=+152.809959600" watchObservedRunningTime="2025-12-06 06:59:38.115123923 +0000 UTC m=+152.928483312" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 
06:59:38.124124 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" event={"ID":"dcb356df-9cbe-46e4-a7c1-584a55224bdf","Type":"ContainerStarted","Data":"df86cdbc3fcff685bdd2ab57c758e26e518d6f3b2387c626ef01d2d7a5a8e0fe"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.135512 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.135671 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.136202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglcx\" (UniqueName: \"kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.136540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.136632 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.137156 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"] Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.137296 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.169101 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.66907033 +0000 UTC m=+153.482429719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.190914 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" event={"ID":"2b823812-b773-4c33-9e75-55395275621d","Type":"ContainerStarted","Data":"b8b0f0dc42e1c683ee5dfc278bf7e21d9eabb697714dfdb475647effaf018a6b"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.196047 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.260103 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.261327 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.761309036 +0000 UTC m=+153.574668415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.266234 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.309806 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglcx\" (UniqueName: \"kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx\") pod \"redhat-marketplace-srtmw\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.338439 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" event={"ID":"76a9970f-0017-4289-b4d5-804bbf7b0e9d","Type":"ContainerStarted","Data":"7c7ae8ec10dda50a5eed6cee5caf7815779003c4f094be0fe18ec4573b5f7256"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.343689 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.365063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.366756 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.866712425 +0000 UTC m=+153.680072014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.367375 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-z6mbx" podStartSLOduration=125.367354102 podStartE2EDuration="2m5.367354102s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.257695341 +0000 UTC m=+153.071054750" watchObservedRunningTime="2025-12-06 06:59:38.367354102 +0000 UTC m=+153.180713491" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.387934 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.389502 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.401410 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" podStartSLOduration=125.401373889 podStartE2EDuration="2m5.401373889s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.364770174 +0000 UTC m=+153.178129573" watchObservedRunningTime="2025-12-06 06:59:38.401373889 +0000 UTC m=+153.214733278" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.403977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerStarted","Data":"f5b33fc35689e7ff1b4c6ba0e64d33bec60ab404f520afcad6919b2b2f041fce"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.405306 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.405518 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.420449 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.430951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" event={"ID":"4ffb6c45-5d12-405d-8b73-9a54b4d0922f","Type":"ContainerStarted","Data":"5473f4f7b80cada19addceb664562b7cb8f239c700226d5c96b68a448d423e85"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.434992 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.472267 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.476785 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:38.976760366 +0000 UTC m=+153.790119755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.477255 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" event={"ID":"5a1c6a2c-6c84-4a85-9fd2-3c589165934a","Type":"ContainerStarted","Data":"fa234445ec741d162779826f5011f5b8f542d7a9d91b274067e059354712a5bf"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.537481 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" event={"ID":"6003dd30-05c4-4565-82c7-d082a2d36d93","Type":"ContainerStarted","Data":"f2306a4bc9d5d1ed07fa1d10905292631f59dd9728f1b4a48a3d50958ad45257"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.540947 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" event={"ID":"9dbb2866-4465-4a76-b007-f381ba65f51e","Type":"ContainerStarted","Data":"6d98bf99433d7bc8cfc42e69636f1cec058a82cfb74caaae4489c675dc502bb0"} Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.558309 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" podStartSLOduration=125.558292762 podStartE2EDuration="2m5.558292762s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.557753508 +0000 UTC m=+153.371112887" watchObservedRunningTime="2025-12-06 06:59:38.558292762 +0000 UTC m=+153.371652151" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.558801 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.567426 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"] Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.576074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxzc\" (UniqueName: \"kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.576133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.576252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content\") 
pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.576318 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.576700 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.076684082 +0000 UTC m=+153.890043471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.578153 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.608647 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"] Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.623131 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" podStartSLOduration=125.623115673 podStartE2EDuration="2m5.623115673s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.621905191 +0000 UTC m=+153.435264570" watchObservedRunningTime="2025-12-06 06:59:38.623115673 +0000 UTC m=+153.436475062" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.684758 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" podStartSLOduration=125.68473817 podStartE2EDuration="2m5.68473817s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.684172496 +0000 UTC m=+153.497531895" watchObservedRunningTime="2025-12-06 06:59:38.68473817 +0000 UTC m=+153.498097559" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.685302 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.685766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxzc\" 
(UniqueName: \"kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.685942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.686080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqhzj\" (UniqueName: \"kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.686110 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.686381 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.686412 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.688908 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.188889389 +0000 UTC m=+154.002248858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.691346 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.694030 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.699994 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:38 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:38 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:38 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.700094 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.761309 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxzc\" (UniqueName: \"kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc\") pod \"redhat-operators-prszd\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.787627 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqhzj\" (UniqueName: \"kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.787683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.788685 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " 
pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.788759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.789270 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.289252056 +0000 UTC m=+154.102611445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.790626 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.790975 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content\") pod \"redhat-operators-p6nxk\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.795859 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" podStartSLOduration=125.795843698 podStartE2EDuration="2m5.795843698s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.727144956 +0000 UTC m=+153.540504355" watchObservedRunningTime="2025-12-06 06:59:38.795843698 +0000 UTC m=+153.609203087" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.799379 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p" podStartSLOduration=125.79936329 podStartE2EDuration="2m5.79936329s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.793694862 +0000 UTC m=+153.607054261" watchObservedRunningTime="2025-12-06 06:59:38.79936329 +0000 UTC m=+153.612722679" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.833966 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqhzj\" (UniqueName: \"kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj\") pod \"redhat-operators-p6nxk\" (UID: 
\"245f7b8a-be27-4fa8-ae16-161d52cef432\") " pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.854698 4954 patch_prober.go:28] interesting pod/console-operator-58897d9998-7p9tp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.854774 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" podUID="9078eab8-cd16-404e-a8e6-e02c60ddfe16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.891642 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:38 crc kubenswrapper[4954]: E1206 06:59:38.892520 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.392500019 +0000 UTC m=+154.205859418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:38 crc kubenswrapper[4954]: I1206 06:59:38.938410 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" podStartSLOduration=125.938380016 podStartE2EDuration="2m5.938380016s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:38.879842479 +0000 UTC m=+153.693201868" watchObservedRunningTime="2025-12-06 06:59:38.938380016 +0000 UTC m=+153.751739405" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.029547 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.030109 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 06:59:39.530094178 +0000 UTC m=+154.343453567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.042964 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccjjn" podStartSLOduration=126.042939213 podStartE2EDuration="2m6.042939213s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:39.039186655 +0000 UTC m=+153.852546044" watchObservedRunningTime="2025-12-06 06:59:39.042939213 +0000 UTC m=+153.856298602" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.065662 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prszd" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.075178 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.116368 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" podStartSLOduration=126.116350068 podStartE2EDuration="2m6.116350068s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:39.101667005 +0000 UTC m=+153.915026404" watchObservedRunningTime="2025-12-06 06:59:39.116350068 +0000 UTC m=+153.929709457" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.139947 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.141471 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.641452763 +0000 UTC m=+154.454812152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.236113 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.236946 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.244136 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.245208 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.745190049 +0000 UTC m=+154.558549438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.245841 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.246179 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.263929 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.345010 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.345592 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.345818 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.345940 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.845921596 +0000 UTC m=+154.659280985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.450768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.450831 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.450864 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.454590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.454763 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:39.954748444 +0000 UTC m=+154.768107833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.506421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.558344 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.558896 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.05886676 +0000 UTC m=+154.872226149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.650225 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"] Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.662095 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.662532 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.162518504 +0000 UTC m=+154.975877893 (durationBeforeRetry 500ms). 
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.678285 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:39 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:39 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:39 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.678423 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.722426 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" event={"ID":"f9ff9691-e91d-48ba-9ee9-de7807534c6e","Type":"ContainerStarted","Data":"6d8bc51a562ade8bc5c72b27b2b26a01b2077374e054f1abd39c39438dbacd9e"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.732848 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.763715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" event={"ID":"b5678dfe-78f2-40b2-8673-8ae5c96bc282","Type":"ContainerStarted","Data":"4d2e225921854f7cdc620333a7c2b0ac12b3e71f9af42883dbe53074dd08cd37"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.764795 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.765323 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.265298925 +0000 UTC m=+155.078658314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.784759 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" event={"ID":"2c9c8d86-e93d-431d-8734-331a2f3997bf","Type":"ContainerStarted","Data":"f07873d626d4664803c4a8ccd6c2d96e79eeeda9fbecd1957c43ff9c5876ea5f"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.810531 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" event={"ID":"5a1c6a2c-6c84-4a85-9fd2-3c589165934a","Type":"ContainerStarted","Data":"f4b1d39d8e363b147a23471c8c6f27be4784c8037c2996b052973f952c33ba11"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.859960 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" event={"ID":"9dbb2866-4465-4a76-b007-f381ba65f51e","Type":"ContainerStarted","Data":"b38caddb0b5efe73f5b079400decc0ff6a11393b083012ee5460aca831a25af2"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.868625 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.869026 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.36901161 +0000 UTC m=+155.182370999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
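Note on the retry cadence: each failed operation is followed by "No retries permitted until <timestamp> (durationBeforeRetry 500ms)", i.e. the volume manager's pending-operations table refuses to restart the same operation on the same volume until a backoff deadline passes, which is why the identical error repeats at roughly half-second intervals. A rough stdlib Go sketch of that gate (field names are illustrative; the 500ms figure matches the log, though the real kubelet policy may grow the delay on repeated failures):

```go
package main

import (
	"fmt"
	"time"
)

// opBackoff tracks the earliest time an operation on a volume may run again,
// mirroring the "No retries permitted until ... (durationBeforeRetry 500ms)" entries.
type opBackoff struct {
	notBefore time.Time
	delay     time.Duration
}

func (b *opBackoff) tryStart(now time.Time) error {
	if now.Before(b.notBefore) {
		return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
			b.notBefore.Format(time.RFC3339Nano), b.delay)
	}
	return nil
}

// markFailed schedules the next permitted attempt after a fixed 500ms window,
// the interval observed throughout this log.
func (b *opBackoff) markFailed(now time.Time) {
	b.delay = 500 * time.Millisecond
	b.notBefore = now.Add(b.delay)
}

func main() {
	var b opBackoff
	now := time.Now()
	b.markFailed(now)
	fmt.Println(b.tryStart(now.Add(100 * time.Millisecond))) // still inside the backoff window
	fmt.Println(b.tryStart(now.Add(600 * time.Millisecond))) // nil: retry permitted again
}
```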
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.904049 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" event={"ID":"4603aefa-389d-4f23-b247-1c7e98f9dc94","Type":"ContainerStarted","Data":"fea1c748b25488364d612bc941aee887ce616930c1dcbe3ce34dc4b2dd964625"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.948409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerStarted","Data":"a10492070d1762339ac5cda96653a795d4486e1cea0a39ad32966fcd90e77cfd"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.948485 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerStarted","Data":"43b631b446c7ca688f6c8d3feb7a9d4bb391b0f4595eceb865184fd672b57f53"}
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.971547 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:39 crc kubenswrapper[4954]: I1206 06:59:39.972249 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bt8q4" event={"ID":"e54ca3bc-3cac-4f64-a27b-6e0f899f16b7","Type":"ContainerStarted","Data":"6f9a8ff59b1afacde6bfbdf6d1a05c51fa53ec4623f9c624ae399b105413529c"}
Dec 06 06:59:39 crc kubenswrapper[4954]: E1206 06:59:39.973210 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.473183917 +0000 UTC m=+155.286543326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.017339 4954 generic.go:334] "Generic (PLEG): container finished" podID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerID="08f2a70e152fc3b7b52ac42f8f94fc0aa02950ea9cac81de60b1976f9fbb05be" exitCode=0
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.017503 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerDied","Data":"08f2a70e152fc3b7b52ac42f8f94fc0aa02950ea9cac81de60b1976f9fbb05be"}
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.018760 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5sc28" podStartSLOduration=127.018728574 podStartE2EDuration="2m7.018728574s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:39.997383338 +0000 UTC m=+154.810742737" watchObservedRunningTime="2025-12-06 06:59:40.018728574 +0000 UTC m=+154.832087963"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.043049 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"]
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.060643 4954 generic.go:334] "Generic (PLEG): container finished" podID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerID="c6bde14c2ff52cd7ce0db45b1a6f51446dd43aa9eacbbe4b6f172408a9175b00" exitCode=0
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.061003 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerDied","Data":"c6bde14c2ff52cd7ce0db45b1a6f51446dd43aa9eacbbe4b6f172408a9175b00"}
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.061039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerStarted","Data":"e1b6ad0e7fb1bdc7caa14203657acdc8285e82c3896e818776395a340a23d265"}
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.061966 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.086303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.086657 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.586641095 +0000 UTC m=+155.400000484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
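Note on the pod_startup_latency_tracker entries: podStartE2EDuration is the gap between podCreationTimestamp and the observed running time, and podStartSLOduration additionally excludes the image-pull window (zero here, since firstStartedPulling/lastFinishedPulling are the Go zero time), so the two coincide. A small Go sketch of that arithmetic using the dns-operator numbers above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps from the "Observed pod startup duration" entry for dns-operator.
	created, _ := time.Parse(time.RFC3339, "2025-12-06T06:57:33Z")
	watchObserved, _ := time.Parse(time.RFC3339Nano, "2025-12-06T06:59:40.018728574Z")

	// No image pull happened in this window (both pull timestamps are the zero
	// time), so SLO and E2E durations are the same value.
	e2e := watchObserved.Sub(created)
	fmt.Println(e2e) // 2m7.018728574s, matching podStartE2EDuration above
}
```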
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.586641095 +0000 UTC m=+155.400000484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.114238 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.114298 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.121826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gk4bb" event={"ID":"6003dd30-05c4-4565-82c7-d082a2d36d93","Type":"ContainerStarted","Data":"61877047ae7129bc994f155078f5119073aa2e55fa7d86ab0c142910106bd20d"} Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.178942 4954 generic.go:334] "Generic (PLEG): container finished" podID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerID="00f2b60d77bb9b4a30bc1b9465fb1774f45251518068eea140e0cf8c48392a58" exitCode=0 Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.179039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerDied","Data":"00f2b60d77bb9b4a30bc1b9465fb1774f45251518068eea140e0cf8c48392a58"} Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.188297 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.189226 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.689209731 +0000 UTC m=+155.502569120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.268092 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4w27r" event={"ID":"8be1c797-26e0-4752-ab88-e593ce382532","Type":"ContainerStarted","Data":"587ba3e91a35f32ca71ddeebf388a2ce2e89b3b509bc436a855cb0953ed868ac"} Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.293821 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.321548 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.331021 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.830991929 +0000 UTC m=+155.644351318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.338778 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.338887 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.342133 4954 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-29x4r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.342206 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.353479 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.379579 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.391150 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpgsz" podStartSLOduration=127.391124987 podStartE2EDuration="2m7.391124987s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:40.339493871 +0000 UTC m=+155.152853260" watchObservedRunningTime="2025-12-06 06:59:40.391124987 +0000 UTC m=+155.204484376" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.414189 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.418974 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:40.918943173 +0000 UTC m=+155.732302562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.504225 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.527066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.534082 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.034062116 +0000 UTC m=+155.847421505 (durationBeforeRetry 500ms). 
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.540920 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fwst7" podStartSLOduration=127.540897594 podStartE2EDuration="2m7.540897594s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:40.52578835 +0000 UTC m=+155.339147739" watchObservedRunningTime="2025-12-06 06:59:40.540897594 +0000 UTC m=+155.354256983"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.623255 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gq5q" podStartSLOduration=127.623234172 podStartE2EDuration="2m7.623234172s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:40.600069527 +0000 UTC m=+155.413428916" watchObservedRunningTime="2025-12-06 06:59:40.623234172 +0000 UTC m=+155.436593561"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.627545 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"]
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.628788 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.629982 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.129961577 +0000 UTC m=+155.943320966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.691647 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:40 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:40 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:40 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.691719 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.734027 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.734589 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.234549565 +0000 UTC m=+156.047908964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.835098 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.835535 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.335515239 +0000 UTC m=+156.148874618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.913741 4954 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 06 06:59:40 crc kubenswrapper[4954]: I1206 06:59:40.948447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:40 crc kubenswrapper[4954]: E1206 06:59:40.949111 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.449096471 +0000 UTC m=+156.262455860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.025131 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4w27r" podStartSLOduration=13.025111184 podStartE2EDuration="13.025111184s" podCreationTimestamp="2025-12-06 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:41.022886206 +0000 UTC m=+155.836245595" watchObservedRunningTime="2025-12-06 06:59:41.025111184 +0000 UTC m=+155.838470563"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.058273 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.058925 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.558906975 +0000 UTC m=+156.372266364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
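Note on the plugin_watcher entry above: this is the turning point of the whole sequence. The driver's registration socket appears under /var/lib/kubelet/plugins_registry/ and is added to the desired-state cache; shortly afterwards the kubelet validates and registers the driver, and the failing volume operations can finally succeed. A simplified stdlib Go sketch of socket discovery by polling (the real kubelet watches this directory with fsnotify; the polling loop and names here are illustrative):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// watchRegistry polls the plugin registry dir and reports .sock files it has
// not seen yet, roughly what the kubelet's plugin watcher achieves with fsnotify.
func watchRegistry(dir string, seen map[string]bool) []string {
	var added []string
	entries, err := os.ReadDir(dir)
	if err != nil {
		return nil // directory may not exist yet; try again next tick
	}
	for _, e := range entries {
		path := filepath.Join(dir, e.Name())
		if filepath.Ext(path) == ".sock" && !seen[path] {
			seen[path] = true
			added = append(added, path)
		}
	}
	return added
}

func main() {
	seen := map[string]bool{}
	for i := 0; i < 3; i++ {
		for _, p := range watchRegistry("/var/lib/kubelet/plugins_registry", seen) {
			fmt.Println("Adding socket path or updating timestamp to desired state cache:", p)
		}
		time.Sleep(time.Second)
	}
}
```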
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.165605 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.167732 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.168103 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.668087153 +0000 UTC m=+156.481446542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.268935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.269891 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.769851797 +0000 UTC m=+156.583211186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.287255 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.287786 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.296591 4954 patch_prober.go:28] interesting pod/console-f9d7485db-z4t2h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.296684 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z4t2h" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.315511 4954 generic.go:334] "Generic (PLEG): container finished" podID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerID="3cdb0c9c57cec75190f61f3c4b7f4bc12cfded742d12726f53162e40559aa137" exitCode=0
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.315662 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerDied","Data":"3cdb0c9c57cec75190f61f3c4b7f4bc12cfded742d12726f53162e40559aa137"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.315697 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerStarted","Data":"9d2cb4f73677493171f2d2be87d85270c4c013a449ea923d4a84baf36e94aafb"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.353353 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerStarted","Data":"72a568a69bf88675436f69dc070dd3f0179b76a570b3f21e8e1db38bc2c652d6"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.354831 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerStarted","Data":"06d108eb293c54982d5c8d1913db27cd72968fcc808340233f2fcebb6eacdb74"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.382293 4954 generic.go:334] "Generic (PLEG): container finished" podID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerID="a10492070d1762339ac5cda96653a795d4486e1cea0a39ad32966fcd90e77cfd" exitCode=0
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.382723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerDied","Data":"a10492070d1762339ac5cda96653a795d4486e1cea0a39ad32966fcd90e77cfd"}
event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerDied","Data":"a10492070d1762339ac5cda96653a795d4486e1cea0a39ad32966fcd90e77cfd"} Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.388022 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.393934 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.409771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.409825 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.419502 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:41.919459899 +0000 UTC m=+156.732819288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.419575 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.419668 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.440069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.440623 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.454736 4954 generic.go:334] "Generic (PLEG): container finished" podID="d4916e82-5036-41a7-9ff5-a710313a630c" containerID="54b3c5fe930cd4a571e0173436080fb883cbf67f8ebbbdb7d1e9f56a130bd7b2" exitCode=0 Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.468305 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.523532 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.524346 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 06:59:42.024326215 +0000 UTC m=+156.837685604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.545245 4954 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-06T06:59:40.91376891Z","Handler":null,"Name":""} Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.558864 4954 generic.go:334] "Generic (PLEG): container finished" podID="c2c7e376-ab13-4dfa-900f-8de633105709" containerID="ef93735e42b3a0f6d59ae3c748e4dd01da95673b9109a66563800aa0c64b986c" exitCode=0 Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.561387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerDied","Data":"54b3c5fe930cd4a571e0173436080fb883cbf67f8ebbbdb7d1e9f56a130bd7b2"} Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.561456 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerStarted","Data":"2cd9cb003f9df7c6eba2507558619ed94b77ac760b3c61f0f3ab722ce324dedb"} Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.561472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" event={"ID":"c2c7e376-ab13-4dfa-900f-8de633105709","Type":"ContainerDied","Data":"ef93735e42b3a0f6d59ae3c748e4dd01da95673b9109a66563800aa0c64b986c"} Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.562017 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.574925 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.616094 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" event={"ID":"b5678dfe-78f2-40b2-8673-8ae5c96bc282","Type":"ContainerStarted","Data":"672e9c6a56071ded4e6a40dff15b44e011ea993bea6ea0430dadce5cd78c8edb"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.631354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prszd" event={"ID":"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850","Type":"ContainerStarted","Data":"00d04afe1ac5ed4a5e8647aba74db78e2c1ccbc80674038d6d95fb351050e2d9"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.631420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prszd" event={"ID":"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850","Type":"ContainerStarted","Data":"a9efa0b9940f7cb10597fcb987b655865f1dd30777fd42dcd48adf17a048ec6c"}
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.631854 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:41 crc kubenswrapper[4954]: E1206 06:59:41.632295 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 06:59:42.132277611 +0000 UTC m=+156.945637000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sc4fg" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.658406 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.682480 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:41 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:41 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:41 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.682583 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.746218 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.746959 4954 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.747021 4954 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.747239 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" podStartSLOduration=128.747212809 podStartE2EDuration="2m8.747212809s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:41.745522575 +0000 UTC m=+156.558881974" watchObservedRunningTime="2025-12-06 06:59:41.747212809 +0000 UTC m=+156.560572198"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.779789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.848215 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.867756 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
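Note on the sequence just above: once the driver is registered (csi_plugin.go entries), the pending UnmountVolume.TearDown finally succeeds, and the mount path proceeds in two steps, MountDevice (the CSI NodeStageVolume call) and SetUp (NodePublishVolume). The csi_attacher entry shows the first step being skipped because this driver does not advertise the STAGE_UNSTAGE_VOLUME node capability. A schematic Go sketch of that decision (the types are stand-ins for illustration, not the CSI spec's generated structs):

```go
package main

import "fmt"

// nodeCapability is a stand-in for the CSI NodeServiceCapability RPC types.
type nodeCapability string

const stageUnstageVolume nodeCapability = "STAGE_UNSTAGE_VOLUME"

type csiDriver struct {
	name string
	caps map[nodeCapability]bool // advertised node service capabilities
}

// mountDevice mirrors the attacher.MountDevice decision: stage the volume at
// the global mount path only if the driver advertises STAGE_UNSTAGE_VOLUME.
func mountDevice(d csiDriver, stagingPath string) {
	if !d.caps[stageUnstageVolume] {
		fmt.Println("attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
		return
	}
	fmt.Println("NodeStageVolume at", stagingPath) // would call the driver's gRPC endpoint
}

func main() {
	// The hostpath provisioner in this log advertises no staging capability,
	// so the kubelet goes straight to NodePublishVolume (MountVolume.SetUp).
	hostpath := csiDriver{name: "kubevirt.io.hostpath-provisioner", caps: map[nodeCapability]bool{}}
	mountDevice(hostpath, "/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/.../globalmount")
}
```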
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.867814 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:41 crc kubenswrapper[4954]: I1206 06:59:41.934947 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sc4fg\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.068541 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.121729 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.123690 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.124023 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.124064 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.124193 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.124253 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.149950 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.172907 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.665339 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-k5tts"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.684615 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 06 06:59:42 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld
Dec 06 06:59:42 crc kubenswrapper[4954]: [+]process-running ok
Dec 06 06:59:42 crc kubenswrapper[4954]: healthz check failed
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.684677 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.697453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" event={"ID":"f9ff9691-e91d-48ba-9ee9-de7807534c6e","Type":"ContainerStarted","Data":"a14b29164beb1b0d2c49c86bea67fb32666c8e179ecb6783e1afb435f8305cc6"}
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.700035 4954 generic.go:334] "Generic (PLEG): container finished" podID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerID="00d04afe1ac5ed4a5e8647aba74db78e2c1ccbc80674038d6d95fb351050e2d9" exitCode=0
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.700111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prszd" event={"ID":"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850","Type":"ContainerDied","Data":"00d04afe1ac5ed4a5e8647aba74db78e2c1ccbc80674038d6d95fb351050e2d9"}
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.700991 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd","Type":"ContainerStarted","Data":"a55379d1df0180711dbde6ff2f7973e04166d6ef30ea8d676ccb8d97c0ef690e"}
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.743189 4954 generic.go:334] "Generic (PLEG): container finished" podID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerID="72a568a69bf88675436f69dc070dd3f0179b76a570b3f21e8e1db38bc2c652d6" exitCode=0
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.744243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerDied","Data":"72a568a69bf88675436f69dc070dd3f0179b76a570b3f21e8e1db38bc2c652d6"}
Dec 06 06:59:42 crc kubenswrapper[4954]: I1206 06:59:42.801409 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c665p"
Dec 06 06:59:42 crc kubenswrapper[4954]: W1206 06:59:42.951712 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0e0824020d01012bb02963a54674e8d0e0063b27427fb767652ee3516c21fe5a WatchSource:0}: Error finding container 0e0824020d01012bb02963a54674e8d0e0063b27427fb767652ee3516c21fe5a: Status 404 returned error can't find the container with id 0e0824020d01012bb02963a54674e8d0e0063b27427fb767652ee3516c21fe5a
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.111034 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.112054 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.137234 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.137526 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.154135 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.219453 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.219538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.303927 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"]
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.325487 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.325602 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.326032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.378198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.506174 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.553962 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.669382 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:43 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:43 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:43 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.670301 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.781212 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.828387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" event={"ID":"f9ff9691-e91d-48ba-9ee9-de7807534c6e","Type":"ContainerStarted","Data":"8ff3ca0a6ed4218d37aefa43b74917cd0063a38828e1a522156755fb06d5203c"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.832236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" event={"ID":"c2c7e376-ab13-4dfa-900f-8de633105709","Type":"ContainerDied","Data":"e766a5d3086914a55fa15a45967f17f99cc7294f3dd71d0a82585ff82461e71e"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.832287 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e766a5d3086914a55fa15a45967f17f99cc7294f3dd71d0a82585ff82461e71e" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.832343 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.847488 4954 generic.go:334] "Generic (PLEG): container finished" podID="a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" containerID="bf44cdc2057aef747fc23f934194d39cfd7615a6fa51f2b9b0388ab0f37efd64" exitCode=0 Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.848048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd","Type":"ContainerDied","Data":"bf44cdc2057aef747fc23f934194d39cfd7615a6fa51f2b9b0388ab0f37efd64"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.848895 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-d7n2g" podStartSLOduration=15.848859664999999 podStartE2EDuration="15.848859665s" podCreationTimestamp="2025-12-06 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:43.847142011 +0000 UTC m=+158.660501400" watchObservedRunningTime="2025-12-06 06:59:43.848859665 +0000 UTC m=+158.662219054" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.859710 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5b987210dbce62fd71d966c512243a20c899b14d25649d483039782f363162d8"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.894387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" event={"ID":"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a","Type":"ContainerStarted","Data":"004f732a4fc7f18394c538b68b079ccf422e6607ffce83b1adf2de484b210837"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.911191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ca6b13cfc62d05bf7b0ccdb1ab9034bb8dedb155626c401a71279baa5f99443"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.911246 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0e0824020d01012bb02963a54674e8d0e0063b27427fb767652ee3516c21fe5a"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.922048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f6fe9b79723d3e82764b4a6a7ba7c328e4beb132d07aac0cac32320341d03325"} Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.922779 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.939110 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume\") pod \"c2c7e376-ab13-4dfa-900f-8de633105709\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " Dec 06 06:59:43 crc kubenswrapper[4954]: 
I1206 06:59:43.939219 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume\") pod \"c2c7e376-ab13-4dfa-900f-8de633105709\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.939525 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rcp\" (UniqueName: \"kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp\") pod \"c2c7e376-ab13-4dfa-900f-8de633105709\" (UID: \"c2c7e376-ab13-4dfa-900f-8de633105709\") " Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.941686 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2c7e376-ab13-4dfa-900f-8de633105709" (UID: "c2c7e376-ab13-4dfa-900f-8de633105709"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.989754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp" (OuterVolumeSpecName: "kube-api-access-k2rcp") pod "c2c7e376-ab13-4dfa-900f-8de633105709" (UID: "c2c7e376-ab13-4dfa-900f-8de633105709"). InnerVolumeSpecName "kube-api-access-k2rcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:59:43 crc kubenswrapper[4954]: I1206 06:59:43.991760 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2c7e376-ab13-4dfa-900f-8de633105709" (UID: "c2c7e376-ab13-4dfa-900f-8de633105709"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.042351 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2c7e376-ab13-4dfa-900f-8de633105709-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.042391 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2c7e376-ab13-4dfa-900f-8de633105709-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.042403 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2rcp\" (UniqueName: \"kubernetes.io/projected/c2c7e376-ab13-4dfa-900f-8de633105709-kube-api-access-k2rcp\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.200961 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 06:59:44 crc kubenswrapper[4954]: W1206 06:59:44.273604 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod242b0103_c12b_4540_ada6_6083450bb1ae.slice/crio-9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5 WatchSource:0}: Error finding container 9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5: Status 404 returned error can't find the container with id 9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5 Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.697698 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:44 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:44 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:44 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.698313 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.960279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c9aa7b0556d8dbf935768bb553d4ff78406456a84ab54830404f375ecf4a1107"} Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.972282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" event={"ID":"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a","Type":"ContainerStarted","Data":"0a96214556870d1e7dd2409421f4eae15f004e3c1071c0837592b26601e398d6"} Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.973942 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 06:59:44 crc kubenswrapper[4954]: I1206 06:59:44.992460 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"851f283016fc9d95979579671915521d35fc3ff2ec6823712956753941ef61f1"} Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.005072 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" podStartSLOduration=132.005041122 podStartE2EDuration="2m12.005041122s" podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:44.997127476 +0000 UTC m=+159.810486875" watchObservedRunningTime="2025-12-06 06:59:45.005041122 +0000 UTC m=+159.818400511" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.008899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"242b0103-c12b-4540-ada6-6083450bb1ae","Type":"ContainerStarted","Data":"9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5"} Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.376092 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.492265 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir\") pod \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.492410 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access\") pod \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\" (UID: \"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd\") " Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.492399 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" (UID: "a5ec7685-0aeb-4cbb-87da-a37f7afae8bd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.513034 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" (UID: "a5ec7685-0aeb-4cbb-87da-a37f7afae8bd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.594629 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.594684 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5ec7685-0aeb-4cbb-87da-a37f7afae8bd-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.669473 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:45 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:45 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:45 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:45 crc kubenswrapper[4954]: I1206 06:59:45.669795 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.019060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a5ec7685-0aeb-4cbb-87da-a37f7afae8bd","Type":"ContainerDied","Data":"a55379d1df0180711dbde6ff2f7973e04166d6ef30ea8d676ccb8d97c0ef690e"} Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.019106 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55379d1df0180711dbde6ff2f7973e04166d6ef30ea8d676ccb8d97c0ef690e" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.019179 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.052680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"242b0103-c12b-4540-ada6-6083450bb1ae","Type":"ContainerStarted","Data":"57e9eedb39173dc9fedb185793f6a01bafca9865197c94c1648a8716be8b3980"} Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.075740 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.075718439 podStartE2EDuration="3.075718439s" podCreationTimestamp="2025-12-06 06:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 06:59:46.071785506 +0000 UTC m=+160.885144915" watchObservedRunningTime="2025-12-06 06:59:46.075718439 +0000 UTC m=+160.889077818" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.380984 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.381587 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.395902 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.669464 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:46 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:46 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:46 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:46 crc kubenswrapper[4954]: I1206 06:59:46.669532 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.086628 4954 generic.go:334] "Generic (PLEG): container finished" podID="242b0103-c12b-4540-ada6-6083450bb1ae" containerID="57e9eedb39173dc9fedb185793f6a01bafca9865197c94c1648a8716be8b3980" exitCode=0 Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.086769 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"242b0103-c12b-4540-ada6-6083450bb1ae","Type":"ContainerDied","Data":"57e9eedb39173dc9fedb185793f6a01bafca9865197c94c1648a8716be8b3980"} Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.094953 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6bfbl" Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.668311 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:47 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 
06:59:47 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:47 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.668437 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:47 crc kubenswrapper[4954]: I1206 06:59:47.814804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4w27r" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.526188 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.667767 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:48 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:48 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:48 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.667855 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.673510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access\") pod \"242b0103-c12b-4540-ada6-6083450bb1ae\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.674490 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "242b0103-c12b-4540-ada6-6083450bb1ae" (UID: "242b0103-c12b-4540-ada6-6083450bb1ae"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.674296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir\") pod \"242b0103-c12b-4540-ada6-6083450bb1ae\" (UID: \"242b0103-c12b-4540-ada6-6083450bb1ae\") " Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.675829 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/242b0103-c12b-4540-ada6-6083450bb1ae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.698123 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "242b0103-c12b-4540-ada6-6083450bb1ae" (UID: "242b0103-c12b-4540-ada6-6083450bb1ae"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 06:59:48 crc kubenswrapper[4954]: I1206 06:59:48.777342 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/242b0103-c12b-4540-ada6-6083450bb1ae-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 06:59:49 crc kubenswrapper[4954]: I1206 06:59:49.136773 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"242b0103-c12b-4540-ada6-6083450bb1ae","Type":"ContainerDied","Data":"9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5"} Dec 06 06:59:49 crc kubenswrapper[4954]: I1206 06:59:49.136865 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a722d6c1522be38f008c8f996831aab1970174f73851560c166b87b8397cbb5" Dec 06 06:59:49 crc kubenswrapper[4954]: I1206 06:59:49.136957 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 06:59:49 crc kubenswrapper[4954]: I1206 06:59:49.667362 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:49 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:49 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:49 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:49 crc kubenswrapper[4954]: I1206 06:59:49.667450 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:50 crc kubenswrapper[4954]: I1206 06:59:50.667216 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:50 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:50 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:50 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:50 crc kubenswrapper[4954]: I1206 06:59:50.667729 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:51 crc kubenswrapper[4954]: I1206 06:59:51.287611 4954 patch_prober.go:28] interesting pod/console-f9d7485db-z4t2h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 06 06:59:51 crc kubenswrapper[4954]: I1206 06:59:51.287673 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z4t2h" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 06 06:59:51 crc kubenswrapper[4954]: I1206 06:59:51.665864 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 06:59:51 crc kubenswrapper[4954]: [-]has-synced failed: reason withheld Dec 06 06:59:51 crc kubenswrapper[4954]: [+]process-running ok Dec 06 06:59:51 crc kubenswrapper[4954]: healthz check failed Dec 06 06:59:51 crc kubenswrapper[4954]: I1206 06:59:51.666005 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 06:59:52 crc kubenswrapper[4954]: I1206 06:59:52.120954 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-56lst" Dec 06 06:59:52 crc kubenswrapper[4954]: I1206 06:59:52.667489 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:52 crc kubenswrapper[4954]: I1206 06:59:52.669857 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-k5tts" Dec 06 06:59:56 crc kubenswrapper[4954]: I1206 06:59:56.133657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:56 crc kubenswrapper[4954]: I1206 06:59:56.142054 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9377db43-9e5b-41e9-a9bc-f5fe3a81a457-metrics-certs\") pod \"network-metrics-daemon-vtxfz\" (UID: \"9377db43-9e5b-41e9-a9bc-f5fe3a81a457\") " pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 06:59:56 crc kubenswrapper[4954]: I1206 06:59:56.361200 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtxfz" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.143279 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm"] Dec 06 07:00:00 crc kubenswrapper[4954]: E1206 07:00:00.143921 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242b0103-c12b-4540-ada6-6083450bb1ae" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.143937 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="242b0103-c12b-4540-ada6-6083450bb1ae" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: E1206 07:00:00.143956 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.143964 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: E1206 07:00:00.143986 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c7e376-ab13-4dfa-900f-8de633105709" containerName="collect-profiles" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.143996 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c7e376-ab13-4dfa-900f-8de633105709" containerName="collect-profiles" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.144130 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="242b0103-c12b-4540-ada6-6083450bb1ae" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.144147 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c7e376-ab13-4dfa-900f-8de633105709" containerName="collect-profiles" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.144155 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ec7685-0aeb-4cbb-87da-a37f7afae8bd" containerName="pruner" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.145005 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.148585 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.148929 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.156554 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm"] Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.305380 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.305515 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.305612 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62b8d\" (UniqueName: \"kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.408023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.408125 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62b8d\" (UniqueName: \"kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.408257 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.410118 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume\") pod 
\"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.415743 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.431219 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62b8d\" (UniqueName: \"kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d\") pod \"collect-profiles-29416740-z7cjm\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:00 crc kubenswrapper[4954]: I1206 07:00:00.470653 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:02 crc kubenswrapper[4954]: I1206 07:00:02.075418 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 07:00:02 crc kubenswrapper[4954]: I1206 07:00:02.120070 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 07:00:02 crc kubenswrapper[4954]: I1206 07:00:02.128691 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-z4t2h" Dec 06 07:00:10 crc kubenswrapper[4954]: I1206 07:00:10.101887 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:00:10 crc kubenswrapper[4954]: I1206 07:00:10.102710 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:00:12 crc kubenswrapper[4954]: I1206 07:00:12.116853 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p752" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.076663 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.079196 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.081393 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.083299 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.083661 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.241823 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.241978 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.343507 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.343587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.343761 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.366883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:18 crc kubenswrapper[4954]: I1206 07:00:18.409246 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:21 crc kubenswrapper[4954]: I1206 07:00:21.568256 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.092423 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.094949 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.095920 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.159607 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.159679 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.159748 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.261063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.261139 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.261174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.261318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.261636 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.281191 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:23 crc kubenswrapper[4954]: I1206 07:00:23.420415 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 07:00:33 crc kubenswrapper[4954]: E1206 07:00:33.164122 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 07:00:33 crc kubenswrapper[4954]: E1206 07:00:33.165000 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2nrpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vfhrk_openshift-marketplace(b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:33 crc kubenswrapper[4954]: E1206 07:00:33.166247 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vfhrk" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" Dec 06 07:00:37 crc kubenswrapper[4954]: E1206 07:00:37.893657 4954 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vfhrk" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.290948 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.291163 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rb69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zbdtt_openshift-marketplace(8571f5ab-84f5-4516-835c-01f5361d0ca4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.292336 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zbdtt" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.315120 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.315314 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sld9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r7qkf_openshift-marketplace(e4fd0cda-b6fe-411e-a815-cd883c0ed24f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:38 crc kubenswrapper[4954]: E1206 07:00:38.316666 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r7qkf" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" Dec 06 07:00:39 crc kubenswrapper[4954]: E1206 07:00:39.504690 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zbdtt" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" Dec 06 07:00:39 crc kubenswrapper[4954]: E1206 07:00:39.504876 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r7qkf" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" Dec 06 07:00:39 crc kubenswrapper[4954]: E1206 07:00:39.578468 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 07:00:39 crc kubenswrapper[4954]: E1206 07:00:39.578689 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbhtg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dghdt_openshift-marketplace(ceb38465-b161-44c8-9e80-4c3df43ed7b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:39 crc kubenswrapper[4954]: E1206 07:00:39.579889 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dghdt" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.101614 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.102198 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.102284 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.103243 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.103392 4954 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066" gracePeriod=600 Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.485631 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066" exitCode=0 Dec 06 07:00:40 crc kubenswrapper[4954]: I1206 07:00:40.485789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066"} Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.646928 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dghdt" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.745988 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.746682 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jglcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-srtmw_openshift-marketplace(d4916e82-5036-41a7-9ff5-a710313a630c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.748844 4954 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-srtmw" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.764734 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.765221 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pqhzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-p6nxk_openshift-marketplace(245f7b8a-be27-4fa8-ae16-161d52cef432): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.766412 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-p6nxk" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.778694 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.778881 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kxzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-prszd_openshift-marketplace(92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.780252 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-prszd" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.800401 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.800615 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f58mz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5wq5n_openshift-marketplace(c21ec18f-c5ed-4306-9732-4ebdf2ee71d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 07:00:42 crc kubenswrapper[4954]: E1206 07:00:42.802383 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5wq5n" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" Dec 06 07:00:42 crc kubenswrapper[4954]: I1206 07:00:42.970556 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.249289 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm"] Dec 06 07:00:43 crc kubenswrapper[4954]: W1206 07:00:43.257170 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a09e25_664b_42af_9587_8ce037ab7b82.slice/crio-513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d WatchSource:0}: Error finding container 513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d: Status 404 returned error can't find the container with id 513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.270145 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.298585 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vtxfz"] Dec 06 07:00:43 crc kubenswrapper[4954]: W1206 07:00:43.331977 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9377db43_9e5b_41e9_a9bc_f5fe3a81a457.slice/crio-cbd9bacb55a3e55afba1f190b2bfd6b423132842f05d075b842a7be64c979f56 WatchSource:0}: Error finding 
container cbd9bacb55a3e55afba1f190b2bfd6b423132842f05d075b842a7be64c979f56: Status 404 returned error can't find the container with id cbd9bacb55a3e55afba1f190b2bfd6b423132842f05d075b842a7be64c979f56 Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.510419 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ca7f06d-b620-4abf-ae84-cbec03b0c534","Type":"ContainerStarted","Data":"b068dc3067c14a8dde11f20a7e98bb530c928568633ca36959fba7eb0daad997"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.512079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ca7f06d-b620-4abf-ae84-cbec03b0c534","Type":"ContainerStarted","Data":"620a6e6700defc367556bcfacef2264de8beb67df8a59c631d83a6fae0ba972a"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.514381 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" event={"ID":"75a09e25-664b-42af-9587-8ce037ab7b82","Type":"ContainerStarted","Data":"29592f7784d620334b63ad3b81dfa27d546fe39ea89acaa466a443c3ffdcf938"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.514411 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" event={"ID":"75a09e25-664b-42af-9587-8ce037ab7b82","Type":"ContainerStarted","Data":"513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.516752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"185de5b8-75f1-4158-9218-06c20298af4a","Type":"ContainerStarted","Data":"3a0a8f7d5c136a3fa5c530e185bc4fb0dd80385c054e6368c3b7cb8fd0bf5890"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.524034 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab"} Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.525243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" event={"ID":"9377db43-9e5b-41e9-a9bc-f5fe3a81a457","Type":"ContainerStarted","Data":"cbd9bacb55a3e55afba1f190b2bfd6b423132842f05d075b842a7be64c979f56"} Dec 06 07:00:43 crc kubenswrapper[4954]: E1206 07:00:43.528757 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-prszd" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" Dec 06 07:00:43 crc kubenswrapper[4954]: E1206 07:00:43.528930 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-p6nxk" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" Dec 06 07:00:43 crc kubenswrapper[4954]: E1206 07:00:43.529705 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5wq5n" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" Dec 06 07:00:43 crc kubenswrapper[4954]: E1206 07:00:43.529790 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-srtmw" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" Dec 06 07:00:43 crc kubenswrapper[4954]: I1206 07:00:43.538681 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=20.538657934 podStartE2EDuration="20.538657934s" podCreationTimestamp="2025-12-06 07:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:00:43.529586438 +0000 UTC m=+218.342945837" watchObservedRunningTime="2025-12-06 07:00:43.538657934 +0000 UTC m=+218.352017323" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.535068 4954 generic.go:334] "Generic (PLEG): container finished" podID="75a09e25-664b-42af-9587-8ce037ab7b82" containerID="29592f7784d620334b63ad3b81dfa27d546fe39ea89acaa466a443c3ffdcf938" exitCode=0 Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.535147 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" event={"ID":"75a09e25-664b-42af-9587-8ce037ab7b82","Type":"ContainerDied","Data":"29592f7784d620334b63ad3b81dfa27d546fe39ea89acaa466a443c3ffdcf938"} Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.537827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"185de5b8-75f1-4158-9218-06c20298af4a","Type":"ContainerStarted","Data":"e84b17a656f2ee5ebf6e521cadec88e3a602537c2654cc6dc59ac4f3ca05e1b4"} Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.539385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" event={"ID":"9377db43-9e5b-41e9-a9bc-f5fe3a81a457","Type":"ContainerStarted","Data":"d13297d64bf455bfd1dd305a72e98f2ded19cc7237a34dad9028f0fdb64a385b"} Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.539424 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtxfz" event={"ID":"9377db43-9e5b-41e9-a9bc-f5fe3a81a457","Type":"ContainerStarted","Data":"bb77220a11fcb94f94d0e60a9425a9c15665521fb5de73cab895fe7cccaab07e"} Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.562601 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=26.56255202 podStartE2EDuration="26.56255202s" podCreationTimestamp="2025-12-06 07:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:00:44.556117392 +0000 UTC m=+219.369476781" watchObservedRunningTime="2025-12-06 07:00:44.56255202 +0000 UTC m=+219.375911409" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.578657 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vtxfz" podStartSLOduration=191.578626169 podStartE2EDuration="3m11.578626169s" 
podCreationTimestamp="2025-12-06 06:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:00:44.570042865 +0000 UTC m=+219.383402254" watchObservedRunningTime="2025-12-06 07:00:44.578626169 +0000 UTC m=+219.391985568" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.834425 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.882926 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume\") pod \"75a09e25-664b-42af-9587-8ce037ab7b82\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.883067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62b8d\" (UniqueName: \"kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d\") pod \"75a09e25-664b-42af-9587-8ce037ab7b82\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.883153 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume\") pod \"75a09e25-664b-42af-9587-8ce037ab7b82\" (UID: \"75a09e25-664b-42af-9587-8ce037ab7b82\") " Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.884369 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume" (OuterVolumeSpecName: "config-volume") pod "75a09e25-664b-42af-9587-8ce037ab7b82" (UID: "75a09e25-664b-42af-9587-8ce037ab7b82"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.889744 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75a09e25-664b-42af-9587-8ce037ab7b82" (UID: "75a09e25-664b-42af-9587-8ce037ab7b82"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.890699 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d" (OuterVolumeSpecName: "kube-api-access-62b8d") pod "75a09e25-664b-42af-9587-8ce037ab7b82" (UID: "75a09e25-664b-42af-9587-8ce037ab7b82"). InnerVolumeSpecName "kube-api-access-62b8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.984222 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75a09e25-664b-42af-9587-8ce037ab7b82-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.984264 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75a09e25-664b-42af-9587-8ce037ab7b82-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:00:44 crc kubenswrapper[4954]: I1206 07:00:44.984276 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62b8d\" (UniqueName: \"kubernetes.io/projected/75a09e25-664b-42af-9587-8ce037ab7b82-kube-api-access-62b8d\") on node \"crc\" DevicePath \"\"" Dec 06 07:00:45 crc kubenswrapper[4954]: I1206 07:00:45.547899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" event={"ID":"75a09e25-664b-42af-9587-8ce037ab7b82","Type":"ContainerDied","Data":"513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d"} Dec 06 07:00:45 crc kubenswrapper[4954]: I1206 07:00:45.547961 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="513bcf3f501351ef86ae158b44a379587718e46769130771631ff1bf47726b9d" Dec 06 07:00:45 crc kubenswrapper[4954]: I1206 07:00:45.547930 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm" Dec 06 07:00:45 crc kubenswrapper[4954]: I1206 07:00:45.550008 4954 generic.go:334] "Generic (PLEG): container finished" podID="185de5b8-75f1-4158-9218-06c20298af4a" containerID="e84b17a656f2ee5ebf6e521cadec88e3a602537c2654cc6dc59ac4f3ca05e1b4" exitCode=0 Dec 06 07:00:45 crc kubenswrapper[4954]: I1206 07:00:45.550962 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"185de5b8-75f1-4158-9218-06c20298af4a","Type":"ContainerDied","Data":"e84b17a656f2ee5ebf6e521cadec88e3a602537c2654cc6dc59ac4f3ca05e1b4"} Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.826598 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.911433 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access\") pod \"185de5b8-75f1-4158-9218-06c20298af4a\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.911576 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir\") pod \"185de5b8-75f1-4158-9218-06c20298af4a\" (UID: \"185de5b8-75f1-4158-9218-06c20298af4a\") " Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.911632 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "185de5b8-75f1-4158-9218-06c20298af4a" (UID: "185de5b8-75f1-4158-9218-06c20298af4a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.911949 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/185de5b8-75f1-4158-9218-06c20298af4a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 07:00:46 crc kubenswrapper[4954]: I1206 07:00:46.916815 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "185de5b8-75f1-4158-9218-06c20298af4a" (UID: "185de5b8-75f1-4158-9218-06c20298af4a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:00:47 crc kubenswrapper[4954]: I1206 07:00:47.013331 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/185de5b8-75f1-4158-9218-06c20298af4a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 07:00:47 crc kubenswrapper[4954]: I1206 07:00:47.566061 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"185de5b8-75f1-4158-9218-06c20298af4a","Type":"ContainerDied","Data":"3a0a8f7d5c136a3fa5c530e185bc4fb0dd80385c054e6368c3b7cb8fd0bf5890"} Dec 06 07:00:47 crc kubenswrapper[4954]: I1206 07:00:47.566370 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a0a8f7d5c136a3fa5c530e185bc4fb0dd80385c054e6368c3b7cb8fd0bf5890" Dec 06 07:00:47 crc kubenswrapper[4954]: I1206 07:00:47.566166 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 07:00:51 crc kubenswrapper[4954]: I1206 07:00:51.588131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerStarted","Data":"1abe9b8016c57f619ba2dc6f693d191bd67737d087800823510285435e750e6e"} Dec 06 07:00:52 crc kubenswrapper[4954]: I1206 07:00:52.596667 4954 generic.go:334] "Generic (PLEG): container finished" podID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerID="1abe9b8016c57f619ba2dc6f693d191bd67737d087800823510285435e750e6e" exitCode=0 Dec 06 07:00:52 crc kubenswrapper[4954]: I1206 07:00:52.596746 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerDied","Data":"1abe9b8016c57f619ba2dc6f693d191bd67737d087800823510285435e750e6e"} Dec 06 07:00:54 crc kubenswrapper[4954]: I1206 07:00:54.609227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerStarted","Data":"42f11f06885ea92c3fdb9230ad88ef6019a7610d019c0fbf5c8f5bc96256504f"} Dec 06 07:00:54 crc kubenswrapper[4954]: I1206 07:00:54.611880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerStarted","Data":"3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07"} Dec 06 07:00:54 crc kubenswrapper[4954]: I1206 07:00:54.649313 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vfhrk" podStartSLOduration=6.54249365 
podStartE2EDuration="1m19.649286742s" podCreationTimestamp="2025-12-06 06:59:35 +0000 UTC" firstStartedPulling="2025-12-06 06:59:39.958145085 +0000 UTC m=+154.771504474" lastFinishedPulling="2025-12-06 07:00:53.064938187 +0000 UTC m=+227.878297566" observedRunningTime="2025-12-06 07:00:54.646097509 +0000 UTC m=+229.459456908" watchObservedRunningTime="2025-12-06 07:00:54.649286742 +0000 UTC m=+229.462646131" Dec 06 07:00:55 crc kubenswrapper[4954]: I1206 07:00:55.620821 4954 generic.go:334] "Generic (PLEG): container finished" podID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerID="42f11f06885ea92c3fdb9230ad88ef6019a7610d019c0fbf5c8f5bc96256504f" exitCode=0 Dec 06 07:00:55 crc kubenswrapper[4954]: I1206 07:00:55.620899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerDied","Data":"42f11f06885ea92c3fdb9230ad88ef6019a7610d019c0fbf5c8f5bc96256504f"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.147756 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.148248 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.236672 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.630839 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerStarted","Data":"707d4e2743dbc76e4c174df3f0152353415c4da3d6541d46b3b6ce8cf7e6d6a8"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.634010 4954 generic.go:334] "Generic (PLEG): container finished" podID="d4916e82-5036-41a7-9ff5-a710313a630c" containerID="e8715c18a906442f4fc26acad8c41e106d44a3ddbc6f29202cf8449f80cd609a" exitCode=0 Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.634084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerDied","Data":"e8715c18a906442f4fc26acad8c41e106d44a3ddbc6f29202cf8449f80cd609a"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.638341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerStarted","Data":"2fca668661f0cf340b4dd568ac6480ee8cd4f22e06405d09c84a6649be9ad0ed"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.640973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerStarted","Data":"4a05e687ecae746477f0d91bffd64a4f879662b1c31d765ba3520590a1cfeaac"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.644657 4954 generic.go:334] "Generic (PLEG): container finished" podID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerID="e275a11fe1c2a0e21104e3b813c3abf8e65394ae24c24669584656cdde1d1ccc" exitCode=0 Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.645164 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" 
event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerDied","Data":"e275a11fe1c2a0e21104e3b813c3abf8e65394ae24c24669584656cdde1d1ccc"} Dec 06 07:00:56 crc kubenswrapper[4954]: I1206 07:00:56.721577 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7qkf" podStartSLOduration=3.961417847 podStartE2EDuration="1m21.721535942s" podCreationTimestamp="2025-12-06 06:59:35 +0000 UTC" firstStartedPulling="2025-12-06 06:59:38.434664088 +0000 UTC m=+153.248023477" lastFinishedPulling="2025-12-06 07:00:56.194782183 +0000 UTC m=+231.008141572" observedRunningTime="2025-12-06 07:00:56.72108409 +0000 UTC m=+231.534443499" watchObservedRunningTime="2025-12-06 07:00:56.721535942 +0000 UTC m=+231.534895331" Dec 06 07:00:57 crc kubenswrapper[4954]: I1206 07:00:57.655430 4954 generic.go:334] "Generic (PLEG): container finished" podID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerID="707d4e2743dbc76e4c174df3f0152353415c4da3d6541d46b3b6ce8cf7e6d6a8" exitCode=0 Dec 06 07:00:57 crc kubenswrapper[4954]: I1206 07:00:57.655980 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerDied","Data":"707d4e2743dbc76e4c174df3f0152353415c4da3d6541d46b3b6ce8cf7e6d6a8"} Dec 06 07:00:57 crc kubenswrapper[4954]: I1206 07:00:57.666726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerStarted","Data":"1826ba2f2f6562ed9bcd850d393eeff99df1d9a064eabd5d5a6a3dfe30acab00"} Dec 06 07:00:57 crc kubenswrapper[4954]: I1206 07:00:57.669821 4954 generic.go:334] "Generic (PLEG): container finished" podID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerID="4a05e687ecae746477f0d91bffd64a4f879662b1c31d765ba3520590a1cfeaac" exitCode=0 Dec 06 07:00:57 crc kubenswrapper[4954]: I1206 07:00:57.669857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerDied","Data":"4a05e687ecae746477f0d91bffd64a4f879662b1c31d765ba3520590a1cfeaac"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.679040 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerStarted","Data":"077ab6db0e07b16d98fe37a9afeab8697c3ae1225670ffe91284d1fbda255662"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.682730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerStarted","Data":"fb899e54d819f5419a642fe9d990ee3ced66427231763c3ec57907866b42b7eb"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.691319 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerStarted","Data":"cf1a4897ed1f0d9b5239936c4ac5919acdfea5dc7a2f8006dd5665bd62c12676"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.695119 4954 generic.go:334] "Generic (PLEG): container finished" podID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerID="1826ba2f2f6562ed9bcd850d393eeff99df1d9a064eabd5d5a6a3dfe30acab00" exitCode=0 Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.695215 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerDied","Data":"1826ba2f2f6562ed9bcd850d393eeff99df1d9a064eabd5d5a6a3dfe30acab00"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.698265 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerStarted","Data":"965ba144f4e5f127555c1b2cd37ec474f4179485121191b711997bc32db1ded9"} Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.735438 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dghdt" podStartSLOduration=4.912721466 podStartE2EDuration="1m21.735417271s" podCreationTimestamp="2025-12-06 06:59:37 +0000 UTC" firstStartedPulling="2025-12-06 06:59:41.320782476 +0000 UTC m=+156.134141865" lastFinishedPulling="2025-12-06 07:00:58.143478281 +0000 UTC m=+232.956837670" observedRunningTime="2025-12-06 07:00:58.711004094 +0000 UTC m=+233.524363483" watchObservedRunningTime="2025-12-06 07:00:58.735417271 +0000 UTC m=+233.548776660" Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.756663 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6nxk" podStartSLOduration=4.11230411 podStartE2EDuration="1m20.756647363s" podCreationTimestamp="2025-12-06 06:59:38 +0000 UTC" firstStartedPulling="2025-12-06 06:59:41.409079629 +0000 UTC m=+156.222439018" lastFinishedPulling="2025-12-06 07:00:58.053422872 +0000 UTC m=+232.866782271" observedRunningTime="2025-12-06 07:00:58.75342949 +0000 UTC m=+233.566788889" watchObservedRunningTime="2025-12-06 07:00:58.756647363 +0000 UTC m=+233.570006742" Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.784209 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-srtmw" podStartSLOduration=5.697534395 podStartE2EDuration="1m21.784189292s" podCreationTimestamp="2025-12-06 06:59:37 +0000 UTC" firstStartedPulling="2025-12-06 06:59:41.467291687 +0000 UTC m=+156.280651076" lastFinishedPulling="2025-12-06 07:00:57.553946584 +0000 UTC m=+232.367305973" observedRunningTime="2025-12-06 07:00:58.780167677 +0000 UTC m=+233.593527066" watchObservedRunningTime="2025-12-06 07:00:58.784189292 +0000 UTC m=+233.597548691" Dec 06 07:00:58 crc kubenswrapper[4954]: I1206 07:00:58.810231 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zbdtt" podStartSLOduration=6.245726776 podStartE2EDuration="1m23.810169369s" podCreationTimestamp="2025-12-06 06:59:35 +0000 UTC" firstStartedPulling="2025-12-06 06:59:40.074136379 +0000 UTC m=+154.887495768" lastFinishedPulling="2025-12-06 07:00:57.638578972 +0000 UTC m=+232.451938361" observedRunningTime="2025-12-06 07:00:58.803576017 +0000 UTC m=+233.616935406" watchObservedRunningTime="2025-12-06 07:00:58.810169369 +0000 UTC m=+233.623528768" Dec 06 07:00:59 crc kubenswrapper[4954]: I1206 07:00:59.076293 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 07:00:59 crc kubenswrapper[4954]: I1206 07:00:59.076352 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 07:01:00 crc kubenswrapper[4954]: I1206 07:01:00.119097 4954 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6nxk" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="registry-server" probeResult="failure" output=< Dec 06 07:01:00 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 07:01:00 crc kubenswrapper[4954]: > Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.338679 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5g5d4"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.603705 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.605396 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.657778 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.672926 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.682989 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.683475 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vfhrk" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" containerID="cri-o://3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.701168 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7qkf"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.716121 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbdtt"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.716594 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zbdtt" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="registry-server" containerID="cri-o://fb899e54d819f5419a642fe9d990ee3ced66427231763c3ec57907866b42b7eb" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.727989 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.728306 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" containerID="cri-o://a9574ad19f6d3ee15996d25ff8e3294a32fe268f15ee300dd1f81198811acc71" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.731308 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.731711 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dghdt" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="registry-server" 
containerID="cri-o://077ab6db0e07b16d98fe37a9afeab8697c3ae1225670ffe91284d1fbda255662" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.735502 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.735904 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-srtmw" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="registry-server" containerID="cri-o://965ba144f4e5f127555c1b2cd37ec474f4179485121191b711997bc32db1ded9" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.738369 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4n6q"] Dec 06 07:01:05 crc kubenswrapper[4954]: E1206 07:01:05.739219 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185de5b8-75f1-4158-9218-06c20298af4a" containerName="pruner" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.739249 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="185de5b8-75f1-4158-9218-06c20298af4a" containerName="pruner" Dec 06 07:01:05 crc kubenswrapper[4954]: E1206 07:01:05.739272 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a09e25-664b-42af-9587-8ce037ab7b82" containerName="collect-profiles" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.739283 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a09e25-664b-42af-9587-8ce037ab7b82" containerName="collect-profiles" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.739412 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="185de5b8-75f1-4158-9218-06c20298af4a" containerName="pruner" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.739428 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a09e25-664b-42af-9587-8ce037ab7b82" containerName="collect-profiles" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.740075 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.747238 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.748024 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6nxk" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="registry-server" containerID="cri-o://cf1a4897ed1f0d9b5239936c4ac5919acdfea5dc7a2f8006dd5665bd62c12676" gracePeriod=30 Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.760665 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4n6q"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.768726 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.813234 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-vfhrk" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" probeResult="failure" output="" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.822178 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.900300 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.900389 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:05 crc kubenswrapper[4954]: I1206 07:01:05.900429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsbc\" (UniqueName: \"kubernetes.io/projected/61c42882-513e-47f2-8faf-a95e4b0126f6-kube-api-access-kbsbc\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.002025 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.002123 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.002175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsbc\" (UniqueName: \"kubernetes.io/projected/61c42882-513e-47f2-8faf-a95e4b0126f6-kube-api-access-kbsbc\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.003519 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.008984 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61c42882-513e-47f2-8faf-a95e4b0126f6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.022065 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsbc\" (UniqueName: \"kubernetes.io/projected/61c42882-513e-47f2-8faf-a95e4b0126f6-kube-api-access-kbsbc\") pod \"marketplace-operator-79b997595-p4n6q\" (UID: \"61c42882-513e-47f2-8faf-a95e4b0126f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.080211 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.106086 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zbdtt" Dec 06 07:01:06 crc kubenswrapper[4954]: E1206 07:01:06.149009 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07 is running failed: container process not found" containerID="3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 07:01:06 crc kubenswrapper[4954]: E1206 07:01:06.149825 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07 is running failed: container process not found" containerID="3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 07:01:06 crc kubenswrapper[4954]: E1206 07:01:06.150140 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07 is running failed: container process not found" containerID="3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" cmd=["grpc_health_probe","-addr=:50051"] Dec 06 07:01:06 crc kubenswrapper[4954]: E1206 07:01:06.150184 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-vfhrk" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.776509 4954 generic.go:334] "Generic (PLEG): container finished" podID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerID="3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" exitCode=0 Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.776547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerDied","Data":"3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07"} Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.780310 4954 generic.go:334] "Generic (PLEG): container finished" podID="d4916e82-5036-41a7-9ff5-a710313a630c" containerID="965ba144f4e5f127555c1b2cd37ec474f4179485121191b711997bc32db1ded9" exitCode=0 Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.780369 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerDied","Data":"965ba144f4e5f127555c1b2cd37ec474f4179485121191b711997bc32db1ded9"} Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.783356 4954 generic.go:334] "Generic (PLEG): container finished" podID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerID="077ab6db0e07b16d98fe37a9afeab8697c3ae1225670ffe91284d1fbda255662" exitCode=0 Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 
07:01:06.783390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerDied","Data":"077ab6db0e07b16d98fe37a9afeab8697c3ae1225670ffe91284d1fbda255662"} Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.785617 4954 generic.go:334] "Generic (PLEG): container finished" podID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerID="a9574ad19f6d3ee15996d25ff8e3294a32fe268f15ee300dd1f81198811acc71" exitCode=0 Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.785663 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" event={"ID":"97da928c-a8a1-48ef-90f4-68a650becdf6","Type":"ContainerDied","Data":"a9574ad19f6d3ee15996d25ff8e3294a32fe268f15ee300dd1f81198811acc71"} Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.788201 4954 generic.go:334] "Generic (PLEG): container finished" podID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerID="fb899e54d819f5419a642fe9d990ee3ced66427231763c3ec57907866b42b7eb" exitCode=0 Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.788291 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerDied","Data":"fb899e54d819f5419a642fe9d990ee3ced66427231763c3ec57907866b42b7eb"} Dec 06 07:01:06 crc kubenswrapper[4954]: I1206 07:01:06.788437 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7qkf" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="registry-server" containerID="cri-o://2fca668661f0cf340b4dd568ac6480ee8cd4f22e06405d09c84a6649be9ad0ed" gracePeriod=30 Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.571361 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.728786 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l226\" (UniqueName: \"kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226\") pod \"97da928c-a8a1-48ef-90f4-68a650becdf6\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.728852 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics\") pod \"97da928c-a8a1-48ef-90f4-68a650becdf6\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.728906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca\") pod \"97da928c-a8a1-48ef-90f4-68a650becdf6\" (UID: \"97da928c-a8a1-48ef-90f4-68a650becdf6\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.730179 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "97da928c-a8a1-48ef-90f4-68a650becdf6" (UID: "97da928c-a8a1-48ef-90f4-68a650becdf6"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.736087 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226" (OuterVolumeSpecName: "kube-api-access-2l226") pod "97da928c-a8a1-48ef-90f4-68a650becdf6" (UID: "97da928c-a8a1-48ef-90f4-68a650becdf6"). InnerVolumeSpecName "kube-api-access-2l226". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.736639 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "97da928c-a8a1-48ef-90f4-68a650becdf6" (UID: "97da928c-a8a1-48ef-90f4-68a650becdf6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.772883 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.800252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" event={"ID":"97da928c-a8a1-48ef-90f4-68a650becdf6","Type":"ContainerDied","Data":"ae991b17adef97791a3877e3ebc1100ca2515671894485f83f54d8d73d0fd2d0"} Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.805063 4954 scope.go:117] "RemoveContainer" containerID="a9574ad19f6d3ee15996d25ff8e3294a32fe268f15ee300dd1f81198811acc71" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.800857 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-29x4r" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.811328 4954 generic.go:334] "Generic (PLEG): container finished" podID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerID="cf1a4897ed1f0d9b5239936c4ac5919acdfea5dc7a2f8006dd5665bd62c12676" exitCode=0 Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.811403 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerDied","Data":"cf1a4897ed1f0d9b5239936c4ac5919acdfea5dc7a2f8006dd5665bd62c12676"} Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.813688 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfhrk" event={"ID":"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f","Type":"ContainerDied","Data":"43b631b446c7ca688f6c8d3feb7a9d4bb391b0f4595eceb865184fd672b57f53"} Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.813780 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfhrk" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.830787 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l226\" (UniqueName: \"kubernetes.io/projected/97da928c-a8a1-48ef-90f4-68a650becdf6-kube-api-access-2l226\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.831003 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.831024 4954 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97da928c-a8a1-48ef-90f4-68a650becdf6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.843141 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"] Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.848724 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-29x4r"] Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.932500 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content\") pod \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.933038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nrpm\" (UniqueName: \"kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm\") pod \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.933140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities\") pod \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\" (UID: \"b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f\") " Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.934047 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities" (OuterVolumeSpecName: "utilities") pod "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" (UID: "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.937072 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm" (OuterVolumeSpecName: "kube-api-access-2nrpm") pod "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" (UID: "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f"). InnerVolumeSpecName "kube-api-access-2nrpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:07 crc kubenswrapper[4954]: I1206 07:01:07.990001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" (UID: "b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.035473 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nrpm\" (UniqueName: \"kubernetes.io/projected/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-kube-api-access-2nrpm\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.035865 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.035942 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.145056 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.148322 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vfhrk"] Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.268027 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 07:01:08 crc kubenswrapper[4954]: I1206 07:01:08.406694 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.454603 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" path="/var/lib/kubelet/pods/97da928c-a8a1-48ef-90f4-68a650becdf6/volumes" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.455870 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" path="/var/lib/kubelet/pods/b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f/volumes" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.678380 4954 scope.go:117] "RemoveContainer" containerID="3fe5231654f12b40a2a7c805cc168900fdb417914f0fd8b96c8d52a90f15fe07" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.757986 4954 scope.go:117] "RemoveContainer" containerID="1abe9b8016c57f619ba2dc6f693d191bd67737d087800823510285435e750e6e" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.860595 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.866304 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content\") pod \"d4916e82-5036-41a7-9ff5-a710313a630c\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.866454 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglcx\" (UniqueName: \"kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx\") pod \"d4916e82-5036-41a7-9ff5-a710313a630c\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.866526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities\") pod \"d4916e82-5036-41a7-9ff5-a710313a630c\" (UID: \"d4916e82-5036-41a7-9ff5-a710313a630c\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.869159 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities" (OuterVolumeSpecName: "utilities") pod "d4916e82-5036-41a7-9ff5-a710313a630c" (UID: "d4916e82-5036-41a7-9ff5-a710313a630c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.877322 4954 scope.go:117] "RemoveContainer" containerID="a10492070d1762339ac5cda96653a795d4486e1cea0a39ad32966fcd90e77cfd" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.886727 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx" (OuterVolumeSpecName: "kube-api-access-jglcx") pod "d4916e82-5036-41a7-9ff5-a710313a630c" (UID: "d4916e82-5036-41a7-9ff5-a710313a630c"). InnerVolumeSpecName "kube-api-access-jglcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.888039 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbdtt" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.909891 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.927309 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4916e82-5036-41a7-9ff5-a710313a630c" (UID: "d4916e82-5036-41a7-9ff5-a710313a630c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.927784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srtmw" event={"ID":"d4916e82-5036-41a7-9ff5-a710313a630c","Type":"ContainerDied","Data":"2cd9cb003f9df7c6eba2507558619ed94b77ac760b3c61f0f3ab722ce324dedb"} Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.927826 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srtmw" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.927843 4954 scope.go:117] "RemoveContainer" containerID="965ba144f4e5f127555c1b2cd37ec474f4179485121191b711997bc32db1ded9" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.944547 4954 scope.go:117] "RemoveContainer" containerID="e8715c18a906442f4fc26acad8c41e106d44a3ddbc6f29202cf8449f80cd609a" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.950852 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghdt" event={"ID":"ceb38465-b161-44c8-9e80-4c3df43ed7b1","Type":"ContainerDied","Data":"9d2cb4f73677493171f2d2be87d85270c4c013a449ea923d4a84baf36e94aafb"} Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.950985 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghdt" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.960913 4954 generic.go:334] "Generic (PLEG): container finished" podID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerID="2fca668661f0cf340b4dd568ac6480ee8cd4f22e06405d09c84a6649be9ad0ed" exitCode=0 Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.961205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerDied","Data":"2fca668661f0cf340b4dd568ac6480ee8cd4f22e06405d09c84a6649be9ad0ed"} Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.966826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbdtt" event={"ID":"8571f5ab-84f5-4516-835c-01f5361d0ca4","Type":"ContainerDied","Data":"e1b6ad0e7fb1bdc7caa14203657acdc8285e82c3896e818776395a340a23d265"} Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.967031 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zbdtt" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968316 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content\") pod \"8571f5ab-84f5-4516-835c-01f5361d0ca4\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968375 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities\") pod \"8571f5ab-84f5-4516-835c-01f5361d0ca4\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968496 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content\") pod \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968539 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rb69\" (UniqueName: \"kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69\") pod \"8571f5ab-84f5-4516-835c-01f5361d0ca4\" (UID: \"8571f5ab-84f5-4516-835c-01f5361d0ca4\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968658 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities\") pod \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.968691 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbhtg\" (UniqueName: \"kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg\") pod \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\" (UID: \"ceb38465-b161-44c8-9e80-4c3df43ed7b1\") " Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.969010 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jglcx\" (UniqueName: \"kubernetes.io/projected/d4916e82-5036-41a7-9ff5-a710313a630c-kube-api-access-jglcx\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.969025 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.969037 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4916e82-5036-41a7-9ff5-a710313a630c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.976036 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities" (OuterVolumeSpecName: "utilities") pod "ceb38465-b161-44c8-9e80-4c3df43ed7b1" (UID: "ceb38465-b161-44c8-9e80-4c3df43ed7b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.977177 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities" (OuterVolumeSpecName: "utilities") pod "8571f5ab-84f5-4516-835c-01f5361d0ca4" (UID: "8571f5ab-84f5-4516-835c-01f5361d0ca4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.986888 4954 scope.go:117] "RemoveContainer" containerID="54b3c5fe930cd4a571e0173436080fb883cbf67f8ebbbdb7d1e9f56a130bd7b2" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.988642 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69" (OuterVolumeSpecName: "kube-api-access-9rb69") pod "8571f5ab-84f5-4516-835c-01f5361d0ca4" (UID: "8571f5ab-84f5-4516-835c-01f5361d0ca4"). InnerVolumeSpecName "kube-api-access-9rb69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.990353 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.992448 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceb38465-b161-44c8-9e80-4c3df43ed7b1" (UID: "ceb38465-b161-44c8-9e80-4c3df43ed7b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:09 crc kubenswrapper[4954]: I1206 07:01:09.992735 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg" (OuterVolumeSpecName: "kube-api-access-wbhtg") pod "ceb38465-b161-44c8-9e80-4c3df43ed7b1" (UID: "ceb38465-b161-44c8-9e80-4c3df43ed7b1"). InnerVolumeSpecName "kube-api-access-wbhtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.015662 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.021245 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-srtmw"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.028896 4954 scope.go:117] "RemoveContainer" containerID="077ab6db0e07b16d98fe37a9afeab8697c3ae1225670ffe91284d1fbda255662" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.051506 4954 scope.go:117] "RemoveContainer" containerID="4a05e687ecae746477f0d91bffd64a4f879662b1c31d765ba3520590a1cfeaac" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074274 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqhzj\" (UniqueName: \"kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj\") pod \"245f7b8a-be27-4fa8-ae16-161d52cef432\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074402 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content\") pod \"245f7b8a-be27-4fa8-ae16-161d52cef432\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074488 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities\") pod \"245f7b8a-be27-4fa8-ae16-161d52cef432\" (UID: \"245f7b8a-be27-4fa8-ae16-161d52cef432\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074824 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074846 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbhtg\" (UniqueName: \"kubernetes.io/projected/ceb38465-b161-44c8-9e80-4c3df43ed7b1-kube-api-access-wbhtg\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074857 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074866 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb38465-b161-44c8-9e80-4c3df43ed7b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.074879 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rb69\" (UniqueName: \"kubernetes.io/projected/8571f5ab-84f5-4516-835c-01f5361d0ca4-kube-api-access-9rb69\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.086620 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj" (OuterVolumeSpecName: "kube-api-access-pqhzj") pod "245f7b8a-be27-4fa8-ae16-161d52cef432" (UID: "245f7b8a-be27-4fa8-ae16-161d52cef432"). 
InnerVolumeSpecName "kube-api-access-pqhzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.093274 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8571f5ab-84f5-4516-835c-01f5361d0ca4" (UID: "8571f5ab-84f5-4516-835c-01f5361d0ca4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.096537 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p4n6q"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.101210 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities" (OuterVolumeSpecName: "utilities") pod "245f7b8a-be27-4fa8-ae16-161d52cef432" (UID: "245f7b8a-be27-4fa8-ae16-161d52cef432"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.135649 4954 scope.go:117] "RemoveContainer" containerID="3cdb0c9c57cec75190f61f3c4b7f4bc12cfded742d12726f53162e40559aa137" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.174598 4954 scope.go:117] "RemoveContainer" containerID="fb899e54d819f5419a642fe9d990ee3ced66427231763c3ec57907866b42b7eb" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.175731 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.175756 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8571f5ab-84f5-4516-835c-01f5361d0ca4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.175768 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqhzj\" (UniqueName: \"kubernetes.io/projected/245f7b8a-be27-4fa8-ae16-161d52cef432-kube-api-access-pqhzj\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.213765 4954 scope.go:117] "RemoveContainer" containerID="e275a11fe1c2a0e21104e3b813c3abf8e65394ae24c24669584656cdde1d1ccc" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.260034 4954 scope.go:117] "RemoveContainer" containerID="c6bde14c2ff52cd7ce0db45b1a6f51446dd43aa9eacbbe4b6f172408a9175b00" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.269892 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "245f7b8a-be27-4fa8-ae16-161d52cef432" (UID: "245f7b8a-be27-4fa8-ae16-161d52cef432"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.277917 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245f7b8a-be27-4fa8-ae16-161d52cef432-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.291994 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.318342 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghdt"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.333668 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbdtt"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.351081 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zbdtt"] Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.639166 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.682033 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content\") pod \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.682123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sld9h\" (UniqueName: \"kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h\") pod \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.682213 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities\") pod \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\" (UID: \"e4fd0cda-b6fe-411e-a815-cd883c0ed24f\") " Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.683421 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities" (OuterVolumeSpecName: "utilities") pod "e4fd0cda-b6fe-411e-a815-cd883c0ed24f" (UID: "e4fd0cda-b6fe-411e-a815-cd883c0ed24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.691204 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h" (OuterVolumeSpecName: "kube-api-access-sld9h") pod "e4fd0cda-b6fe-411e-a815-cd883c0ed24f" (UID: "e4fd0cda-b6fe-411e-a815-cd883c0ed24f"). InnerVolumeSpecName "kube-api-access-sld9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.761784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4fd0cda-b6fe-411e-a815-cd883c0ed24f" (UID: "e4fd0cda-b6fe-411e-a815-cd883c0ed24f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.784183 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.784224 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sld9h\" (UniqueName: \"kubernetes.io/projected/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-kube-api-access-sld9h\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.784238 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4fd0cda-b6fe-411e-a815-cd883c0ed24f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.972922 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" event={"ID":"61c42882-513e-47f2-8faf-a95e4b0126f6","Type":"ContainerStarted","Data":"a171d3ab58bb7f20ceab82d11174636006fc24a32366deb70357bb3e089a0439"} Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.973004 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" event={"ID":"61c42882-513e-47f2-8faf-a95e4b0126f6","Type":"ContainerStarted","Data":"0fa0804ca1da5f29e3360314bedd1a2bb750b53684b33c854b135bb05785320f"} Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.974304 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.976175 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.978655 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerStarted","Data":"6ef6cea829332596066e18d30ff787550810b1c19b6f1709322366be5fc2d3ec"} Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.978787 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5wq5n" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="registry-server" containerID="cri-o://6ef6cea829332596066e18d30ff787550810b1c19b6f1709322366be5fc2d3ec" gracePeriod=30 Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.981269 4954 generic.go:334] "Generic (PLEG): container finished" podID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerID="28783838f744ba7c064264e9031d742246e9a388a5b1f854032fd91b7ea1e357" exitCode=0 Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.981298 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prszd" event={"ID":"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850","Type":"ContainerDied","Data":"28783838f744ba7c064264e9031d742246e9a388a5b1f854032fd91b7ea1e357"} Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.983516 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7qkf" 
event={"ID":"e4fd0cda-b6fe-411e-a815-cd883c0ed24f","Type":"ContainerDied","Data":"f5b33fc35689e7ff1b4c6ba0e64d33bec60ab404f520afcad6919b2b2f041fce"} Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.983574 4954 scope.go:117] "RemoveContainer" containerID="2fca668661f0cf340b4dd568ac6480ee8cd4f22e06405d09c84a6649be9ad0ed" Dec 06 07:01:10 crc kubenswrapper[4954]: I1206 07:01:10.983682 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7qkf" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.002166 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p4n6q" podStartSLOduration=6.002141612 podStartE2EDuration="6.002141612s" podCreationTimestamp="2025-12-06 07:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:01:10.992373027 +0000 UTC m=+245.805732426" watchObservedRunningTime="2025-12-06 07:01:11.002141612 +0000 UTC m=+245.815501001" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.004953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6nxk" event={"ID":"245f7b8a-be27-4fa8-ae16-161d52cef432","Type":"ContainerDied","Data":"06d108eb293c54982d5c8d1913db27cd72968fcc808340233f2fcebb6eacdb74"} Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.005139 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6nxk" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.016175 4954 scope.go:117] "RemoveContainer" containerID="42f11f06885ea92c3fdb9230ad88ef6019a7610d019c0fbf5c8f5bc96256504f" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.017188 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5wq5n" podStartSLOduration=6.525045104 podStartE2EDuration="1m36.017176604s" podCreationTimestamp="2025-12-06 06:59:35 +0000 UTC" firstStartedPulling="2025-12-06 06:59:40.267231906 +0000 UTC m=+155.080591295" lastFinishedPulling="2025-12-06 07:01:09.759363396 +0000 UTC m=+244.572722795" observedRunningTime="2025-12-06 07:01:11.016437365 +0000 UTC m=+245.829796754" watchObservedRunningTime="2025-12-06 07:01:11.017176604 +0000 UTC m=+245.830535993" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.049990 4954 scope.go:117] "RemoveContainer" containerID="08f2a70e152fc3b7b52ac42f8f94fc0aa02950ea9cac81de60b1976f9fbb05be" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.078815 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"] Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.087410 4954 scope.go:117] "RemoveContainer" containerID="cf1a4897ed1f0d9b5239936c4ac5919acdfea5dc7a2f8006dd5665bd62c12676" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.089787 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6nxk"] Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.093193 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7qkf"] Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.098992 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7qkf"] Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.121789 4954 
scope.go:117] "RemoveContainer" containerID="707d4e2743dbc76e4c174df3f0152353415c4da3d6541d46b3b6ce8cf7e6d6a8" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.149382 4954 scope.go:117] "RemoveContainer" containerID="72a568a69bf88675436f69dc070dd3f0179b76a570b3f21e8e1db38bc2c652d6" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.226713 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prszd" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.290662 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kxzc\" (UniqueName: \"kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc\") pod \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.290719 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities\") pod \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.290765 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content\") pod \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\" (UID: \"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850\") " Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.291972 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities" (OuterVolumeSpecName: "utilities") pod "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" (UID: "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.296435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc" (OuterVolumeSpecName: "kube-api-access-4kxzc") pod "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" (UID: "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850"). InnerVolumeSpecName "kube-api-access-4kxzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.392255 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kxzc\" (UniqueName: \"kubernetes.io/projected/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-kube-api-access-4kxzc\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.392303 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.425933 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" (UID: "92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.455329 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" path="/var/lib/kubelet/pods/245f7b8a-be27-4fa8-ae16-161d52cef432/volumes" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.456293 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" path="/var/lib/kubelet/pods/8571f5ab-84f5-4516-835c-01f5361d0ca4/volumes" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.457075 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" path="/var/lib/kubelet/pods/ceb38465-b161-44c8-9e80-4c3df43ed7b1/volumes" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.458504 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" path="/var/lib/kubelet/pods/d4916e82-5036-41a7-9ff5-a710313a630c/volumes" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.459333 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" path="/var/lib/kubelet/pods/e4fd0cda-b6fe-411e-a815-cd883c0ed24f/volumes" Dec 06 07:01:11 crc kubenswrapper[4954]: I1206 07:01:11.493659 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.015589 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5wq5n_c21ec18f-c5ed-4306-9732-4ebdf2ee71d9/registry-server/0.log" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.017080 4954 generic.go:334] "Generic (PLEG): container finished" podID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerID="6ef6cea829332596066e18d30ff787550810b1c19b6f1709322366be5fc2d3ec" exitCode=1 Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.017173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerDied","Data":"6ef6cea829332596066e18d30ff787550810b1c19b6f1709322366be5fc2d3ec"} Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.019726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prszd" event={"ID":"92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850","Type":"ContainerDied","Data":"a9efa0b9940f7cb10597fcb987b655865f1dd30777fd42dcd48adf17a048ec6c"} Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.019770 4954 scope.go:117] "RemoveContainer" containerID="28783838f744ba7c064264e9031d742246e9a388a5b1f854032fd91b7ea1e357" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.019916 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prszd" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.056918 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xqpx7"] Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.058773 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.058796 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.058812 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.058820 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.058959 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.058969 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.058976 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.058982 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059025 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059032 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059042 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059049 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059055 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059060 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059067 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059101 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059110 4954 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059117 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059123 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059130 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059141 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059147 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059154 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059188 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059198 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059204 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059212 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059218 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059226 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059236 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059264 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059271 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059278 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059285 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059295 4954 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059301 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059309 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059316 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059333 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059339 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="extract-utilities" Dec 06 07:01:12 crc kubenswrapper[4954]: E1206 07:01:12.059349 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059355 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059584 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb38465-b161-44c8-9e80-4c3df43ed7b1" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059601 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" containerName="extract-content" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059607 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b267ba1c-ec79-40fa-902e-9fa1c4d0fc2f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059616 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="97da928c-a8a1-48ef-90f4-68a650becdf6" containerName="marketplace-operator" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059627 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8571f5ab-84f5-4516-835c-01f5361d0ca4" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059634 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fd0cda-b6fe-411e-a815-cd883c0ed24f" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059642 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="245f7b8a-be27-4fa8-ae16-161d52cef432" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.059650 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4916e82-5036-41a7-9ff5-a710313a630c" containerName="registry-server" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.060437 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.063437 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.071740 4954 scope.go:117] "RemoveContainer" containerID="00d04afe1ac5ed4a5e8647aba74db78e2c1ccbc80674038d6d95fb351050e2d9" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.080907 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqpx7"] Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.091260 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.095468 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prszd"] Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.099875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7t5\" (UniqueName: \"kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.100056 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.100139 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.201620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.201748 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7t5\" (UniqueName: \"kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.202326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.204306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.206508 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.233545 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7t5\" (UniqueName: \"kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5\") pod \"community-operators-xqpx7\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") " pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.427881 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.441351 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5wq5n_c21ec18f-c5ed-4306-9732-4ebdf2ee71d9/registry-server/0.log" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.442217 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.606682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities\") pod \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.606735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f58mz\" (UniqueName: \"kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz\") pod \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.606761 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content\") pod \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\" (UID: \"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9\") " Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.608086 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities" (OuterVolumeSpecName: "utilities") pod "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" (UID: "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.613011 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz" (OuterVolumeSpecName: "kube-api-access-f58mz") pod "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" (UID: "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9"). InnerVolumeSpecName "kube-api-access-f58mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.652966 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqpx7"] Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.661176 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" (UID: "c21ec18f-c5ed-4306-9732-4ebdf2ee71d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:01:12 crc kubenswrapper[4954]: W1206 07:01:12.662024 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9cc4661_bc27_418f_be19_fb8acd289e3f.slice/crio-7d155bccbdfd72871e8df9e507c0ba22992ca03a8ea74b4759d253a325f02819 WatchSource:0}: Error finding container 7d155bccbdfd72871e8df9e507c0ba22992ca03a8ea74b4759d253a325f02819: Status 404 returned error can't find the container with id 7d155bccbdfd72871e8df9e507c0ba22992ca03a8ea74b4759d253a325f02819 Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.707807 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.708298 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f58mz\" (UniqueName: \"kubernetes.io/projected/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-kube-api-access-f58mz\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:12 crc kubenswrapper[4954]: I1206 07:01:12.708318 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.041010 4954 generic.go:334] "Generic (PLEG): container finished" podID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerID="812ab61b745c1904453e4ebeee1f2374493033e2616e18659f7183bd285a6237" exitCode=0 Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.041144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerDied","Data":"812ab61b745c1904453e4ebeee1f2374493033e2616e18659f7183bd285a6237"} Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.041339 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerStarted","Data":"7d155bccbdfd72871e8df9e507c0ba22992ca03a8ea74b4759d253a325f02819"} Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.046638 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5wq5n_c21ec18f-c5ed-4306-9732-4ebdf2ee71d9/registry-server/0.log" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.048300 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wq5n" event={"ID":"c21ec18f-c5ed-4306-9732-4ebdf2ee71d9","Type":"ContainerDied","Data":"f0b5a77da269f6dff239fa0aca6f3fee519ba7203fed1f181f91e1960633a3e0"} Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.048342 4954 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wq5n" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.048378 4954 scope.go:117] "RemoveContainer" containerID="6ef6cea829332596066e18d30ff787550810b1c19b6f1709322366be5fc2d3ec" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.071264 4954 scope.go:117] "RemoveContainer" containerID="1826ba2f2f6562ed9bcd850d393eeff99df1d9a064eabd5d5a6a3dfe30acab00" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.085926 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"] Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.088410 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5wq5n"] Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.102965 4954 scope.go:117] "RemoveContainer" containerID="00f2b60d77bb9b4a30bc1b9465fb1774f45251518068eea140e0cf8c48392a58" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.458913 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850" path="/var/lib/kubelet/pods/92d3edfd-eaa7-4ce2-a0f9-07bd5aa9a850/volumes" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.459651 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" path="/var/lib/kubelet/pods/c21ec18f-c5ed-4306-9732-4ebdf2ee71d9/volumes" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.845345 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76xzd"] Dec 06 07:01:13 crc kubenswrapper[4954]: E1206 07:01:13.845615 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="registry-server" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.845631 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="registry-server" Dec 06 07:01:13 crc kubenswrapper[4954]: E1206 07:01:13.845649 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="extract-content" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.845655 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="extract-content" Dec 06 07:01:13 crc kubenswrapper[4954]: E1206 07:01:13.845668 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="extract-utilities" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.845676 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="extract-utilities" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.845755 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21ec18f-c5ed-4306-9732-4ebdf2ee71d9" containerName="registry-server" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.846486 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.850373 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.858399 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76xzd"] Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.927181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75qx\" (UniqueName: \"kubernetes.io/projected/2f2fbf41-1df2-47da-96e7-84c6c4514646-kube-api-access-d75qx\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.927265 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-catalog-content\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:13 crc kubenswrapper[4954]: I1206 07:01:13.930353 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-utilities\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.032543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75qx\" (UniqueName: \"kubernetes.io/projected/2f2fbf41-1df2-47da-96e7-84c6c4514646-kube-api-access-d75qx\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.032647 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-catalog-content\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.032693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-utilities\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.033318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-utilities\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.033427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2fbf41-1df2-47da-96e7-84c6c4514646-catalog-content\") pod \"redhat-marketplace-76xzd\" (UID: 
\"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.056740 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75qx\" (UniqueName: \"kubernetes.io/projected/2f2fbf41-1df2-47da-96e7-84c6c4514646-kube-api-access-d75qx\") pod \"redhat-marketplace-76xzd\" (UID: \"2f2fbf41-1df2-47da-96e7-84c6c4514646\") " pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.169110 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76xzd" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.405840 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76xzd"] Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.444202 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hzqhb"] Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.446712 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.449369 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.459433 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzqhb"] Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.538649 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-utilities\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.538728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-catalog-content\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.538763 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrj56\" (UniqueName: \"kubernetes.io/projected/6ccf6324-bb8e-416b-b705-27a3b16f01ed-kube-api-access-zrj56\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.640493 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-utilities\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.640661 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-catalog-content\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " 
pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.640719 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrj56\" (UniqueName: \"kubernetes.io/projected/6ccf6324-bb8e-416b-b705-27a3b16f01ed-kube-api-access-zrj56\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.641205 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-catalog-content\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.641422 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ccf6324-bb8e-416b-b705-27a3b16f01ed-utilities\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.665984 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrj56\" (UniqueName: \"kubernetes.io/projected/6ccf6324-bb8e-416b-b705-27a3b16f01ed-kube-api-access-zrj56\") pod \"redhat-operators-hzqhb\" (UID: \"6ccf6324-bb8e-416b-b705-27a3b16f01ed\") " pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:14 crc kubenswrapper[4954]: I1206 07:01:14.772009 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzqhb" Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.078322 4954 generic.go:334] "Generic (PLEG): container finished" podID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerID="d10aa37641a42d454224670f9b09eeccb3b4c0a22e6ccb1c4f89c6da044883fe" exitCode=0 Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.078958 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerDied","Data":"d10aa37641a42d454224670f9b09eeccb3b4c0a22e6ccb1c4f89c6da044883fe"} Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.082963 4954 generic.go:334] "Generic (PLEG): container finished" podID="2f2fbf41-1df2-47da-96e7-84c6c4514646" containerID="ad02343397e5be20df8e9b1af90834951f085428e6b00e5e0c88de6c4dc66ece" exitCode=0 Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.083034 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76xzd" event={"ID":"2f2fbf41-1df2-47da-96e7-84c6c4514646","Type":"ContainerDied","Data":"ad02343397e5be20df8e9b1af90834951f085428e6b00e5e0c88de6c4dc66ece"} Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.083108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76xzd" event={"ID":"2f2fbf41-1df2-47da-96e7-84c6c4514646","Type":"ContainerStarted","Data":"a72a703123aa7e407117bd56ad64ef960ec0f2619e7ddd2a32b3c5ce190398f9"} Dec 06 07:01:15 crc kubenswrapper[4954]: I1206 07:01:15.205112 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzqhb"] Dec 06 07:01:15 crc kubenswrapper[4954]: W1206 07:01:15.213346 4954 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ccf6324_bb8e_416b_b705_27a3b16f01ed.slice/crio-5a47cb6b8dc9f0f2cc5d4a6031d13a617fcbe6c70e4fa536c25891d2c94a7c7e WatchSource:0}: Error finding container 5a47cb6b8dc9f0f2cc5d4a6031d13a617fcbe6c70e4fa536c25891d2c94a7c7e: Status 404 returned error can't find the container with id 5a47cb6b8dc9f0f2cc5d4a6031d13a617fcbe6c70e4fa536c25891d2c94a7c7e Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.102089 4954 generic.go:334] "Generic (PLEG): container finished" podID="2f2fbf41-1df2-47da-96e7-84c6c4514646" containerID="862879b4c61690ebb64632e6370b9bb0ca10426517554ad336215362562d01ff" exitCode=0 Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.102190 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76xzd" event={"ID":"2f2fbf41-1df2-47da-96e7-84c6c4514646","Type":"ContainerDied","Data":"862879b4c61690ebb64632e6370b9bb0ca10426517554ad336215362562d01ff"} Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.107857 4954 generic.go:334] "Generic (PLEG): container finished" podID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" containerID="4feb0c5596b921e0e91f5fe0ab70cb9e7774c4eebaba23659256cee7b5b0ac20" exitCode=0 Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.107956 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzqhb" event={"ID":"6ccf6324-bb8e-416b-b705-27a3b16f01ed","Type":"ContainerDied","Data":"4feb0c5596b921e0e91f5fe0ab70cb9e7774c4eebaba23659256cee7b5b0ac20"} Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.107994 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzqhb" event={"ID":"6ccf6324-bb8e-416b-b705-27a3b16f01ed","Type":"ContainerStarted","Data":"5a47cb6b8dc9f0f2cc5d4a6031d13a617fcbe6c70e4fa536c25891d2c94a7c7e"} Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.111863 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerStarted","Data":"0ccad4510dcb08566464f3fe79877a3df59cc000b01a9aab1dc3e009c2809632"} Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.155972 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xqpx7" podStartSLOduration=1.7375806599999999 podStartE2EDuration="4.155944008s" podCreationTimestamp="2025-12-06 07:01:12 +0000 UTC" firstStartedPulling="2025-12-06 07:01:13.042659345 +0000 UTC m=+247.856018734" lastFinishedPulling="2025-12-06 07:01:15.461022693 +0000 UTC m=+250.274382082" observedRunningTime="2025-12-06 07:01:16.153850734 +0000 UTC m=+250.967210133" watchObservedRunningTime="2025-12-06 07:01:16.155944008 +0000 UTC m=+250.969303397" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.241293 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sbt66"] Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.242524 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.246937 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.255973 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbt66"] Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.263050 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-utilities\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.263113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwjg\" (UniqueName: \"kubernetes.io/projected/89a2ab41-05c3-433e-aa02-4cc276fe349a-kube-api-access-9gwjg\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.263149 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-catalog-content\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.364627 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwjg\" (UniqueName: \"kubernetes.io/projected/89a2ab41-05c3-433e-aa02-4cc276fe349a-kube-api-access-9gwjg\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.366077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-catalog-content\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.366299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-utilities\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.366868 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-catalog-content\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.366951 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a2ab41-05c3-433e-aa02-4cc276fe349a-utilities\") pod \"certified-operators-sbt66\" (UID: 
\"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.389356 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwjg\" (UniqueName: \"kubernetes.io/projected/89a2ab41-05c3-433e-aa02-4cc276fe349a-kube-api-access-9gwjg\") pod \"certified-operators-sbt66\" (UID: \"89a2ab41-05c3-433e-aa02-4cc276fe349a\") " pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.560889 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbt66" Dec 06 07:01:16 crc kubenswrapper[4954]: I1206 07:01:16.833433 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbt66"] Dec 06 07:01:17 crc kubenswrapper[4954]: I1206 07:01:17.120236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbt66" event={"ID":"89a2ab41-05c3-433e-aa02-4cc276fe349a","Type":"ContainerStarted","Data":"37ed0d17bee359fb4492a75eeafcd29132195563792bc9c7e2fde362865a828c"} Dec 06 07:01:18 crc kubenswrapper[4954]: I1206 07:01:18.129524 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzqhb" event={"ID":"6ccf6324-bb8e-416b-b705-27a3b16f01ed","Type":"ContainerStarted","Data":"1f98673368d2027aa5426bfdfa194a7f4b42a8dab0a046320b12ac92813b00d2"} Dec 06 07:01:18 crc kubenswrapper[4954]: I1206 07:01:18.131354 4954 generic.go:334] "Generic (PLEG): container finished" podID="89a2ab41-05c3-433e-aa02-4cc276fe349a" containerID="2a4a54c54078f7e2188d48dfa69947cb41688a9345ebae41c9fd198839bc48e1" exitCode=0 Dec 06 07:01:18 crc kubenswrapper[4954]: I1206 07:01:18.131395 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbt66" event={"ID":"89a2ab41-05c3-433e-aa02-4cc276fe349a","Type":"ContainerDied","Data":"2a4a54c54078f7e2188d48dfa69947cb41688a9345ebae41c9fd198839bc48e1"} Dec 06 07:01:18 crc kubenswrapper[4954]: I1206 07:01:18.134807 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76xzd" event={"ID":"2f2fbf41-1df2-47da-96e7-84c6c4514646","Type":"ContainerStarted","Data":"88825dc474cc1663f3fcda013a4898721727dae1f37074cbe4bbfa1d42d661d7"} Dec 06 07:01:18 crc kubenswrapper[4954]: I1206 07:01:18.209862 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76xzd" podStartSLOduration=3.667914642 podStartE2EDuration="5.20983454s" podCreationTimestamp="2025-12-06 07:01:13 +0000 UTC" firstStartedPulling="2025-12-06 07:01:15.086169456 +0000 UTC m=+249.899528845" lastFinishedPulling="2025-12-06 07:01:16.628089354 +0000 UTC m=+251.441448743" observedRunningTime="2025-12-06 07:01:18.208544126 +0000 UTC m=+253.021903515" watchObservedRunningTime="2025-12-06 07:01:18.20983454 +0000 UTC m=+253.023193929" Dec 06 07:01:19 crc kubenswrapper[4954]: I1206 07:01:19.142979 4954 generic.go:334] "Generic (PLEG): container finished" podID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" containerID="1f98673368d2027aa5426bfdfa194a7f4b42a8dab0a046320b12ac92813b00d2" exitCode=0 Dec 06 07:01:19 crc kubenswrapper[4954]: I1206 07:01:19.143066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzqhb" 
event={"ID":"6ccf6324-bb8e-416b-b705-27a3b16f01ed","Type":"ContainerDied","Data":"1f98673368d2027aa5426bfdfa194a7f4b42a8dab0a046320b12ac92813b00d2"} Dec 06 07:01:19 crc kubenswrapper[4954]: E1206 07:01:19.147257 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a2ab41_05c3_433e_aa02_4cc276fe349a.slice/crio-conmon-8dce7a81c22d0153c78e670c0a040a8248fc8bec9e612f4047702a2fd0e8820e.scope\": RecentStats: unable to find data in memory cache]" Dec 06 07:01:19 crc kubenswrapper[4954]: I1206 07:01:19.152034 4954 generic.go:334] "Generic (PLEG): container finished" podID="89a2ab41-05c3-433e-aa02-4cc276fe349a" containerID="8dce7a81c22d0153c78e670c0a040a8248fc8bec9e612f4047702a2fd0e8820e" exitCode=0 Dec 06 07:01:19 crc kubenswrapper[4954]: I1206 07:01:19.152518 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbt66" event={"ID":"89a2ab41-05c3-433e-aa02-4cc276fe349a","Type":"ContainerDied","Data":"8dce7a81c22d0153c78e670c0a040a8248fc8bec9e612f4047702a2fd0e8820e"} Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.991848 4954 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.992791 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73" gracePeriod=15 Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.992900 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5" gracePeriod=15 Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.992929 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739" gracePeriod=15 Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.992982 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354" gracePeriod=15 Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.992832 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96" gracePeriod=15 Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.995501 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996028 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996055 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996074 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996083 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996095 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996103 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996115 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996123 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996133 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996140 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996152 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996161 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996174 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996184 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 07:01:20 crc kubenswrapper[4954]: E1206 07:01:20.996198 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996205 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996361 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996379 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 
07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996391 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996406 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996417 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996424 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.996678 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 07:01:20 crc kubenswrapper[4954]: I1206 07:01:20.998398 4954 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.013085 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.023244 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.062619 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.143732 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.143981 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.144026 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.144058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc 
kubenswrapper[4954]: I1206 07:01:21.144125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.144167 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.144196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.144222 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.174111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzqhb" event={"ID":"6ccf6324-bb8e-416b-b705-27a3b16f01ed","Type":"ContainerStarted","Data":"0be6d84f405ce1eed51e46522bf664217efaa271b9d76aea40ed09b9ed7ed190"} Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.175056 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.175456 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.178288 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbt66" event={"ID":"89a2ab41-05c3-433e-aa02-4cc276fe349a","Type":"ContainerStarted","Data":"3b6d9da40d6dba579312589653261351d6e5d9b7f7e0091d6fb94f88c726ad69"} Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.178815 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 
07:01:21.179364 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.179993 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246479 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246547 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246666 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246715 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.246972 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247085 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247173 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247231 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247296 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247353 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.247396 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: I1206 07:01:21.369042 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:01:21 crc kubenswrapper[4954]: W1206 07:01:21.392494 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-9bb1485a1e3e62865854a50fca386a7dc4abf5dd06a652a9e754dd0078ad03d0 WatchSource:0}: Error finding container 9bb1485a1e3e62865854a50fca386a7dc4abf5dd06a652a9e754dd0078ad03d0: Status 404 returned error can't find the container with id 9bb1485a1e3e62865854a50fca386a7dc4abf5dd06a652a9e754dd0078ad03d0 Dec 06 07:01:21 crc kubenswrapper[4954]: E1206 07:01:21.396401 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.114:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e8e39748fb7bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 07:01:21.395668925 +0000 UTC m=+256.209028334,LastTimestamp:2025-12-06 07:01:21.395668925 +0000 UTC m=+256.209028334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.187837 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f9f631dc4762d1aa00e2cb55922fc155966fdb8ee3587f5650abda1a3f961ba1"} Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.188424 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9bb1485a1e3e62865854a50fca386a7dc4abf5dd06a652a9e754dd0078ad03d0"} Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.188819 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.189012 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.189188 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.190028 4954 generic.go:334] "Generic (PLEG): container finished" podID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" containerID="b068dc3067c14a8dde11f20a7e98bb530c928568633ca36959fba7eb0daad997" exitCode=0 Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.190079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ca7f06d-b620-4abf-ae84-cbec03b0c534","Type":"ContainerDied","Data":"b068dc3067c14a8dde11f20a7e98bb530c928568633ca36959fba7eb0daad997"} Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.190420 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.190797 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.191465 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.191914 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.192776 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194038 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194809 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96" exitCode=0 Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194836 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739" exitCode=0 Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194845 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5" 
exitCode=0 Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194854 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354" exitCode=2 Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.194930 4954 scope.go:117] "RemoveContainer" containerID="b42462c67dcc5b8981987627b8dbf8b89c989192051ede9d39b7d37877feb335" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.428965 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.429044 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.471025 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.471893 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.472595 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.473207 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.473587 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:22 crc kubenswrapper[4954]: I1206 07:01:22.473897 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.236118 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.288085 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xqpx7" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.289332 4954 status_manager.go:851] "Failed to get status for pod" 
podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.289927 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.291617 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.292169 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.294199 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.494622 4954 util.go:48] "No ready sandbox for pod can be found. 
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.496095 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.496386 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.496706 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.496983 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.497228 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.682118 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir\") pod \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") "
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.682237 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access\") pod \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") "
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.682282 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock\") pod \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\" (UID: \"4ca7f06d-b620-4abf-ae84-cbec03b0c534\") "
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.682450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock" (OuterVolumeSpecName: "var-lock") pod "4ca7f06d-b620-4abf-ae84-cbec03b0c534" (UID: "4ca7f06d-b620-4abf-ae84-cbec03b0c534"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.682450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ca7f06d-b620-4abf-ae84-cbec03b0c534" (UID: "4ca7f06d-b620-4abf-ae84-cbec03b0c534"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.689550 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ca7f06d-b620-4abf-ae84-cbec03b0c534" (UID: "4ca7f06d-b620-4abf-ae84-cbec03b0c534"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.783463 4954 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.783519 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ca7f06d-b620-4abf-ae84-cbec03b0c534-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:23 crc kubenswrapper[4954]: I1206 07:01:23.783530 4954 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ca7f06d-b620-4abf-ae84-cbec03b0c534-var-lock\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.169715 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76xzd"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.170213 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76xzd"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.221371 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.222302 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.222950 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.223153 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.223318 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.223535 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.223744 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.224175 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.232935 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76xzd"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.233809 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.234300 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.234636 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.234933 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.235269 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.235509 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.235934 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.252188 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.255180 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73" exitCode=0
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.255305 4954 scope.go:117] "RemoveContainer" containerID="1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.255493 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.262511 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ca7f06d-b620-4abf-ae84-cbec03b0c534","Type":"ContainerDied","Data":"620a6e6700defc367556bcfacef2264de8beb67df8a59c631d83a6fae0ba972a"}
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.262601 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620a6e6700defc367556bcfacef2264de8beb67df8a59c631d83a6fae0ba972a"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.262646 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.277952 4954 scope.go:117] "RemoveContainer" containerID="1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.287933 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.291940 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.292292 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.292731 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.293155 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.293402 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.293922 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.303555 4954 scope.go:117] "RemoveContainer" containerID="5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.309332 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76xzd"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.310040 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.310401 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.310914 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.311109 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.311266 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.311445 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.311853 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.322503 4954 scope.go:117] "RemoveContainer" containerID="ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.356128 4954 scope.go:117] "RemoveContainer" containerID="afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.379518 4954 scope.go:117] "RemoveContainer" containerID="4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.391943 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.392153 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.392296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.392094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.392612 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.393028 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.393484 4954 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.394032 4954 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.394054 4954 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.404080 4954 scope.go:117] "RemoveContainer" containerID="1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96"
Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.406020 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\": container with ID starting with 1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96 not found: ID does not exist" containerID="1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.406067 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96"} err="failed to get container status \"1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\": rpc error: code = NotFound desc = could not find container \"1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96\": container with ID starting with 1e6e4f5cd73d6c5575cedcbd0dd660912c12f95ba458dccec6627b6d3b8c6f96 not found: ID does not exist"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.406094 4954 scope.go:117] "RemoveContainer" containerID="1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739"
Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.406720 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\": container with ID starting with 1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739 not found: ID does not exist" containerID="1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.406831 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739"} err="failed to get container status \"1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\": rpc error: code = NotFound desc = could not find container \"1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739\": container with ID starting with 1db9487c15e7dfe04f678947ed83883ea45cee13e5c08bc8e0dcaf83cecf6739 not found: ID does not exist"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.406997 4954 scope.go:117] "RemoveContainer" containerID="5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5"
Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.408234 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\": container with ID starting with 5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5 not found: ID does not exist" containerID="5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5"
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\": container with ID starting with 5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5 not found: ID does not exist" containerID="5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.408274 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5"} err="failed to get container status \"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\": rpc error: code = NotFound desc = could not find container \"5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5\": container with ID starting with 5229ff531fb2dd0b3b07ab39070b9055ecfc283eeaa335ace77a80822de269f5 not found: ID does not exist" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.408301 4954 scope.go:117] "RemoveContainer" containerID="ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354" Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.410913 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\": container with ID starting with ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354 not found: ID does not exist" containerID="ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.410979 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354"} err="failed to get container status \"ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\": rpc error: code = NotFound desc = could not find container \"ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354\": container with ID starting with ff47fd7267ef33edd9fb0c04d5cfdc77818a72927a8fc4c662bc4979563ed354 not found: ID does not exist" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.411021 4954 scope.go:117] "RemoveContainer" containerID="afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73" Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.412044 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\": container with ID starting with afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73 not found: ID does not exist" containerID="afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.412105 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73"} err="failed to get container status \"afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\": rpc error: code = NotFound desc = could not find container \"afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73\": container with ID starting with afea04f8ce3cb693b5870bb4141f722fb7d08df5ac12ed400165340c39658b73 not found: ID does not exist" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.412134 4954 scope.go:117] "RemoveContainer" 
containerID="4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c" Dec 06 07:01:24 crc kubenswrapper[4954]: E1206 07:01:24.412967 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\": container with ID starting with 4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c not found: ID does not exist" containerID="4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.413027 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c"} err="failed to get container status \"4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\": rpc error: code = NotFound desc = could not find container \"4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c\": container with ID starting with 4fd3b139ccb3491db8365ec7553dfb2f3b503808198c552bafdc3196cee72b0c not found: ID does not exist" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.572841 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.573106 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.573724 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.574390 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.574763 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.575172 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused" Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.575605 
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.772667 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hzqhb"
Dec 06 07:01:24 crc kubenswrapper[4954]: I1206 07:01:24.772764 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hzqhb"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.210030 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.211010 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.211651 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.212040 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.212406 4954 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.212472 4954 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.212962 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="200ms"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.414752 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="400ms"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.447472 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.448009 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.448240 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.448417 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.448613 4954 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.448788 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.449027 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.452229 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.658318 4954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.114:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e8e39748fb7bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 07:01:21.395668925 +0000 UTC m=+256.209028334,LastTimestamp:2025-12-06 07:01:21.395668925 +0000 UTC m=+256.209028334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 06 07:01:25 crc kubenswrapper[4954]: I1206 07:01:25.815272 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hzqhb" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" containerName="registry-server" probeResult="failure" output=<
Dec 06 07:01:25 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 07:01:25 crc kubenswrapper[4954]: >
Dec 06 07:01:25 crc kubenswrapper[4954]: E1206 07:01:25.815652 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="800ms"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.563541 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sbt66"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.564098 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sbt66"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.614079 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sbt66"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.614741 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.615140 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.615597 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.615920 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.616119 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:26 crc kubenswrapper[4954]: E1206 07:01:26.616255 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="1.6s"
Dec 06 07:01:26 crc kubenswrapper[4954]: I1206 07:01:26.616295 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.321485 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sbt66"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.322239 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.322673 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.323003 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.323267 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.323591 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:27 crc kubenswrapper[4954]: I1206 07:01:27.323906 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:28 crc kubenswrapper[4954]: E1206 07:01:28.217728 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="3.2s"
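[editor's note] The startup-probe failure above ('timeout: failed to connect service ":50051" within 1s') is the message a gRPC health-probe binary prints when it cannot reach the registry-server port inside one second. A standalone approximation of that check with a raw TCP dial; the address is a stand-in, since the registry pod's IP is not shown in this log, and this is not the probe binary itself:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "127.0.0.1:50051" // stand-in; the real probe targets the pod IP on :50051
	conn, err := net.DialTimeout("tcp", addr, time.Second)
	if err != nil {
		// Same failure shape as the probe output above: no connection within 1s.
		fmt.Printf("timeout: failed to connect service %q within 1s (%v)\n", ":50051", err)
		return
	}
	conn.Close()
	fmt.Println("connected: probe would pass")
}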
connection refused" interval="3.2s" Dec 06 07:01:30 crc kubenswrapper[4954]: I1206 07:01:30.383133 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" containerID="cri-o://5af4757e9ea26ef3a1ad1c9124b5cea1df905378f885c22950890d2303be7c41" gracePeriod=15 Dec 06 07:01:31 crc kubenswrapper[4954]: I1206 07:01:31.163439 4954 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5g5d4 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Dec 06 07:01:31 crc kubenswrapper[4954]: I1206 07:01:31.164062 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Dec 06 07:01:31 crc kubenswrapper[4954]: E1206 07:01:31.419231 4954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.114:6443: connect: connection refused" interval="6.4s" Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.313358 4954 generic.go:334] "Generic (PLEG): container finished" podID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerID="5af4757e9ea26ef3a1ad1c9124b5cea1df905378f885c22950890d2303be7c41" exitCode=0 Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.313472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" event={"ID":"4c71517f-f5be-4508-8ce4-df43bd8700b7","Type":"ContainerDied","Data":"5af4757e9ea26ef3a1ad1c9124b5cea1df905378f885c22950890d2303be7c41"} Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.313826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" event={"ID":"4c71517f-f5be-4508-8ce4-df43bd8700b7","Type":"ContainerDied","Data":"2378b48570e9865f3430dbf17c91ffe52a56edb05cd9c7f5db33323a009ced8d"} Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.313854 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2378b48570e9865f3430dbf17c91ffe52a56edb05cd9c7f5db33323a009ced8d" Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.333081 4954 util.go:48] "No ready sandbox for pod can be found. 
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.333784 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.334509 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.335308 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.335556 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.335995 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.336443 4954 status_manager.go:851] "Failed to get status for pod" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5g5d4\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.336681 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411664 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411727 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411758 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411803 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411854 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411884 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411921 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411946 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpnb\" (UniqueName: \"kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.411977 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412014 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412043 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412107 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection\") pod \"4c71517f-f5be-4508-8ce4-df43bd8700b7\" (UID: \"4c71517f-f5be-4508-8ce4-df43bd8700b7\") "
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412902 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.412985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.414517 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.415186 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.415276 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.432392 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.448802 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.448960 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.450371 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.450659 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb" (OuterVolumeSpecName: "kube-api-access-4wpnb") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "kube-api-access-4wpnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.450702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.451004 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.451621 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.451878 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4c71517f-f5be-4508-8ce4-df43bd8700b7" (UID: "4c71517f-f5be-4508-8ce4-df43bd8700b7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514002 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514447 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514460 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514472 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514485 4954 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514499 4954 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c71517f-f5be-4508-8ce4-df43bd8700b7-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514509 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514521 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpnb\" (UniqueName: \"kubernetes.io/projected/4c71517f-f5be-4508-8ce4-df43bd8700b7-kube-api-access-4wpnb\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514534 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514545 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514574 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514591 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514628 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:32 crc kubenswrapper[4954]: I1206 07:01:32.514642 4954 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c71517f-f5be-4508-8ce4-df43bd8700b7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.320457 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.322348 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.322964 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.323702 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.324202 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.324616 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.324957 4954 status_manager.go:851] "Failed to get status for pod" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5g5d4\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.325423 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.442988 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.444556 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.444774 4954 status_manager.go:851] "Failed to get status for pod" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5g5d4\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.444945 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.445101 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.445437 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.445749 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.446226 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.936662 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.936710 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:33 crc kubenswrapper[4954]: E1206 07:01:33.937203 4954 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:33 crc kubenswrapper[4954]: I1206 07:01:33.937660 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:33 crc kubenswrapper[4954]: W1206 07:01:33.961331 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-df486dd8fd93b6612b696ae0a4fb17e9f35b44a909f3f10bb3ba1d4cf3fd3b72 WatchSource:0}: Error finding container df486dd8fd93b6612b696ae0a4fb17e9f35b44a909f3f10bb3ba1d4cf3fd3b72: Status 404 returned error can't find the container with id df486dd8fd93b6612b696ae0a4fb17e9f35b44a909f3f10bb3ba1d4cf3fd3b72
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.329290 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.329726 4954 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="de5913af46a45e08c8e6774fc882ac9c66797ddc6429110d193bab506e91126e" exitCode=1
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.329802 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"de5913af46a45e08c8e6774fc882ac9c66797ddc6429110d193bab506e91126e"}
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.330364 4954 scope.go:117] "RemoveContainer" containerID="de5913af46a45e08c8e6774fc882ac9c66797ddc6429110d193bab506e91126e"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.330930 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331366 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331443 4954 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d9206a4fa5b23f33704becffd3e58b9be5bed9ccf890a0dafc4e985853aaffcd" exitCode=0
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331485 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d9206a4fa5b23f33704becffd3e58b9be5bed9ccf890a0dafc4e985853aaffcd"}
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df486dd8fd93b6612b696ae0a4fb17e9f35b44a909f3f10bb3ba1d4cf3fd3b72"}
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331820 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331853 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.331870 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.332096 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: E1206 07:01:34.332216 4954 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.114:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.332332 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.332638 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.332862 4954 status_manager.go:851] "Failed to get status for pod" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5g5d4\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.333131 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.333589 4954 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.334156 4954 status_manager.go:851] "Failed to get status for pod" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" pod="openshift-marketplace/community-operators-xqpx7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xqpx7\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.334674 4954 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.334964 4954 status_manager.go:851] "Failed to get status for pod" podUID="89a2ab41-05c3-433e-aa02-4cc276fe349a" pod="openshift-marketplace/certified-operators-sbt66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sbt66\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.335247 4954 status_manager.go:851] "Failed to get status for pod" podUID="2f2fbf41-1df2-47da-96e7-84c6c4514646" pod="openshift-marketplace/redhat-marketplace-76xzd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-76xzd\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.335515 4954 status_manager.go:851] "Failed to get status for pod" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.335894 4954 status_manager.go:851] "Failed to get status for pod" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5g5d4\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.336190 4954 status_manager.go:851] "Failed to get status for pod" podUID="6ccf6324-bb8e-416b-b705-27a3b16f01ed" pod="openshift-marketplace/redhat-operators-hzqhb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hzqhb\": dial tcp 38.129.56.114:6443: connect: connection refused"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.818941 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hzqhb"
Dec 06 07:01:34 crc kubenswrapper[4954]: I1206 07:01:34.862681 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hzqhb"
Dec 06 07:01:35 crc kubenswrapper[4954]: I1206 07:01:35.349928 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 06 07:01:35 crc kubenswrapper[4954]: I1206 07:01:35.350409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae3b796603fc5213051f46425c4579e6965667523eb33272f8e336d59eff51a9"}
Dec 06 07:01:35 crc kubenswrapper[4954]: I1206 07:01:35.370618 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c797a81938003d0b3d298a53cbfea621629c1cbb386b835bb2aec61dd090be7c"}
Dec 06 07:01:35 crc kubenswrapper[4954]: I1206 07:01:35.370710 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16238c8bd7bcabf7cc910a967fcf89cc32797f58736ccdd39e7a3f4bcd85d6b2"}
Dec 06 07:01:35 crc kubenswrapper[4954]: I1206 07:01:35.370738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"69a705307918cc1bd3143bfda2b17ecea9649cf8f24ae24c03be2ccfe3df430f"}
Dec 06 07:01:36 crc kubenswrapper[4954]: I1206 07:01:36.379361 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"97d213b7869e2cdc34f7ff3cd99433668203304c1207406b163fac7c4f0942aa"}
Dec 06 07:01:36 crc kubenswrapper[4954]: I1206 07:01:36.379708 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:36 crc kubenswrapper[4954]: I1206 07:01:36.379746 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:36 crc kubenswrapper[4954]: I1206 07:01:36.379723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"34d35df0fa40b0598735f5d84e2af82946e0c1194911468227fb64089825e530"}
Dec 06 07:01:36 crc kubenswrapper[4954]: I1206 07:01:36.379832 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:37 crc kubenswrapper[4954]: I1206 07:01:37.519209 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 07:01:37 crc kubenswrapper[4954]: I1206 07:01:37.519631 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 06 07:01:37 crc kubenswrapper[4954]: I1206 07:01:37.519949 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 06 07:01:38 crc kubenswrapper[4954]: I1206 07:01:38.938179 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:38 crc kubenswrapper[4954]: I1206 07:01:38.938262 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:38 crc kubenswrapper[4954]: I1206 07:01:38.943436 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:39 crc kubenswrapper[4954]: I1206 07:01:39.088455 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 06 07:01:41 crc kubenswrapper[4954]: I1206 07:01:41.394074 4954 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:41 crc kubenswrapper[4954]: I1206 07:01:41.418478 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:41 crc kubenswrapper[4954]: I1206 07:01:41.418510 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:41 crc kubenswrapper[4954]: I1206 07:01:41.425020 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 06 07:01:41 crc kubenswrapper[4954]: I1206 07:01:41.428465 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fa07b3bb-b19c-4314-9672-e4c3ba649286"
Dec 06 07:01:42 crc kubenswrapper[4954]: I1206 07:01:42.427350 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:42 crc kubenswrapper[4954]: I1206 07:01:42.427404 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661"
Dec 06 07:01:45 crc kubenswrapper[4954]: I1206 07:01:45.483405 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fa07b3bb-b19c-4314-9672-e4c3ba649286"
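Editor's note: the startup-probe failures above are the kubelet issuing GET /healthz on 192.168.126.11:10257 and getting connection refused while kube-controller-manager comes back up. An equivalent check done by hand might look like the sketch below; the endpoint is taken from the log, and InsecureSkipVerify is a stand-in for the probe's own certificate handling, for illustration only:

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 2 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}

	resp, err := client.Get("https://192.168.126.11:10257/healthz")
	if err != nil {
		// While the process is down this prints the same
		// "connection refused" seen in the probe output above.
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status)
}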
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fa07b3bb-b19c-4314-9672-e4c3ba649286" Dec 06 07:01:47 crc kubenswrapper[4954]: I1206 07:01:47.518878 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 06 07:01:47 crc kubenswrapper[4954]: I1206 07:01:47.518983 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 06 07:01:51 crc kubenswrapper[4954]: I1206 07:01:51.343665 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 07:01:52 crc kubenswrapper[4954]: I1206 07:01:52.002934 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 07:01:52 crc kubenswrapper[4954]: I1206 07:01:52.122417 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 07:01:52 crc kubenswrapper[4954]: I1206 07:01:52.299658 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 07:01:52 crc kubenswrapper[4954]: I1206 07:01:52.310364 4954 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 07:01:53 crc kubenswrapper[4954]: I1206 07:01:53.242272 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 07:01:53 crc kubenswrapper[4954]: I1206 07:01:53.621935 4954 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 07:01:53 crc kubenswrapper[4954]: I1206 07:01:53.860154 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.051584 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.063177 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.185010 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.249079 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.397266 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.505353 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 07:01:54 crc kubenswrapper[4954]: I1206 07:01:54.553668 4954 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.007179 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.021146 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.034204 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.037140 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.144186 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.160361 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.328965 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.604475 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.622089 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.685900 4954 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.699143 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.737351 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.807285 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.893123 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 07:01:55 crc kubenswrapper[4954]: I1206 07:01:55.949338 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.185646 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.274030 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.355495 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.365089 4954 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.377368 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.410074 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.631370 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.632265 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.782395 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.807370 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.820864 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 07:01:56 crc kubenswrapper[4954]: I1206 07:01:56.834791 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.032731 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.032999 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.235112 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.294307 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.318703 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.401864 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.491894 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.519035 4954 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.519126 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.519207 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.520193 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ae3b796603fc5213051f46425c4579e6965667523eb33272f8e336d59eff51a9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.520361 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ae3b796603fc5213051f46425c4579e6965667523eb33272f8e336d59eff51a9" gracePeriod=30 Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.522812 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.531247 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.549467 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.616674 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.727397 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.884675 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.912346 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.920260 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.945355 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 07:01:57 crc kubenswrapper[4954]: I1206 07:01:57.987738 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.029829 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.147486 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.174172 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.311384 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.339438 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.383032 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.411423 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.417438 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.481510 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.512825 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.563054 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.585257 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.635200 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.679898 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.685940 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.724900 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.754462 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.823031 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.859551 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.891927 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 07:01:58 crc kubenswrapper[4954]: I1206 07:01:58.980454 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.017795 4954 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.027641 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.029773 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.104848 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.273753 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.300026 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.365200 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.379486 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.459441 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.472105 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.508492 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.583222 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.609555 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.642129 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.820346 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.832144 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.940847 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.962927 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.971728 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 07:01:59 crc kubenswrapper[4954]: I1206 07:01:59.999372 4954 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.017012 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.045392 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.114679 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.201688 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.288125 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.311980 4954 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.323370 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.412305 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.446343 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.545116 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.550216 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.577473 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.609069 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.671765 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.786326 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.943269 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 07:02:00 crc kubenswrapper[4954]: I1206 07:02:00.963969 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.030454 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.073527 4954 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.091046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.107441 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.253594 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.278510 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.314065 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.387259 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.451037 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.492832 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.518365 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.582910 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.584907 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.695327 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.734011 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.774288 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 07:02:01 crc kubenswrapper[4954]: I1206 07:02:01.839650 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.012286 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.119674 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.161318 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 
07:02:02.206122 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.211211 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.250105 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.337667 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.389196 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.408520 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.410746 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.487033 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.508290 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.540668 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.587381 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.665504 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.762754 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.777047 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.792155 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.889869 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 07:02:02 crc kubenswrapper[4954]: I1206 07:02:02.960453 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.001046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.003688 4954 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.014376 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.060024 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.130041 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.189955 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.292024 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.293197 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.353373 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.471146 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.483141 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.487143 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.498951 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.572694 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.575855 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.721960 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.768244 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.768272 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.776344 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.794464 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.909548 4954 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.910847 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.922300 4954 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod4c71517f-f5be-4508-8ce4-df43bd8700b7"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod4c71517f-f5be-4508-8ce4-df43bd8700b7] : Timed out while waiting for systemd to remove kubepods-burstable-pod4c71517f_f5be_4508_8ce4_df43bd8700b7.slice" Dec 06 07:02:03 crc kubenswrapper[4954]: E1206 07:02:03.922381 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod4c71517f-f5be-4508-8ce4-df43bd8700b7] : unable to destroy cgroup paths for cgroup [kubepods burstable pod4c71517f-f5be-4508-8ce4-df43bd8700b7] : Timed out while waiting for systemd to remove kubepods-burstable-pod4c71517f_f5be_4508_8ce4_df43bd8700b7.slice" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.938342 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.945510 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 07:02:03 crc kubenswrapper[4954]: I1206 07:02:03.998051 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.065746 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.154280 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.210288 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.212959 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.293079 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.369988 4954 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.446195 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.450843 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.579870 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5g5d4" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.581951 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.657346 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.668787 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.686033 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.744126 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.767761 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.888825 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 07:02:04 crc kubenswrapper[4954]: I1206 07:02:04.930898 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.003046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.007825 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.063557 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.119984 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.154946 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.166772 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.175353 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.215542 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.225073 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.323524 4954 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.367714 4954 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.475747 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.523949 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.684027 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.716072 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 07:02:05 crc kubenswrapper[4954]: I1206 07:02:05.936984 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.048849 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.210495 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.309726 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.311913 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.384256 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.466826 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.507771 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.591469 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.824961 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.933734 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 07:02:06 crc kubenswrapper[4954]: I1206 07:02:06.963686 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.112707 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.246158 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.285351 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 07:02:07 crc 
kubenswrapper[4954]: I1206 07:02:07.325224 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.393350 4954 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.394588 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sbt66" podStartSLOduration=49.307372056 podStartE2EDuration="51.394542026s" podCreationTimestamp="2025-12-06 07:01:16 +0000 UTC" firstStartedPulling="2025-12-06 07:01:18.132982845 +0000 UTC m=+252.946342234" lastFinishedPulling="2025-12-06 07:01:20.220152815 +0000 UTC m=+255.033512204" observedRunningTime="2025-12-06 07:01:41.192533763 +0000 UTC m=+276.005893172" watchObservedRunningTime="2025-12-06 07:02:07.394542026 +0000 UTC m=+302.207901435" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.407865 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hzqhb" podStartSLOduration=49.414399212 podStartE2EDuration="53.407837133s" podCreationTimestamp="2025-12-06 07:01:14 +0000 UTC" firstStartedPulling="2025-12-06 07:01:16.112262249 +0000 UTC m=+250.925621638" lastFinishedPulling="2025-12-06 07:01:20.10570017 +0000 UTC m=+254.919059559" observedRunningTime="2025-12-06 07:01:41.246144094 +0000 UTC m=+276.059503493" watchObservedRunningTime="2025-12-06 07:02:07.407837133 +0000 UTC m=+302.221196512" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.410322 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=46.410291119 podStartE2EDuration="46.410291119s" podCreationTimestamp="2025-12-06 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:01:41.147451561 +0000 UTC m=+275.960810950" watchObservedRunningTime="2025-12-06 07:02:07.410291119 +0000 UTC m=+302.223650498" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.412590 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-5g5d4"] Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.412646 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv","openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 07:02:07 crc kubenswrapper[4954]: E1206 07:02:07.412904 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.412925 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" Dec 06 07:02:07 crc kubenswrapper[4954]: E1206 07:02:07.412956 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" containerName="installer" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.412964 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" containerName="installer" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.413137 4954 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ca7f06d-b620-4abf-ae84-cbec03b0c534" containerName="installer" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.413157 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" containerName="oauth-openshift" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.413313 4954 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.413349 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="598b09d0-83c1-4b56-b622-5d01f4792661" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.413957 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.420598 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.420688 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.420799 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.420836 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.420886 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.421169 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.423723 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.423837 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.424112 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.425340 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.425438 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.428247 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.428993 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.430090 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 07:02:07 crc 
kubenswrapper[4954]: I1206 07:02:07.434780 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.439796 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.452197 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c71517f-f5be-4508-8ce4-df43bd8700b7" path="/var/lib/kubelet/pods/4c71517f-f5be-4508-8ce4-df43bd8700b7/volumes" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.466507 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.46648721 podStartE2EDuration="26.46648721s" podCreationTimestamp="2025-12-06 07:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:02:07.462872753 +0000 UTC m=+302.276232162" watchObservedRunningTime="2025-12-06 07:02:07.46648721 +0000 UTC m=+302.279846599" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.513986 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514047 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514071 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-audit-policies\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514100 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc 
kubenswrapper[4954]: I1206 07:02:07.514343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514424 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514453 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fl4p\" (UniqueName: \"kubernetes.io/projected/9f938443-5e76-454f-b814-23d0f60515d4-kube-api-access-2fl4p\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514641 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f938443-5e76-454f-b814-23d0f60515d4-audit-dir\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514810 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: 
\"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.514900 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.527871 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.601076 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-audit-policies\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616261 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616314 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616395 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " 
pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fl4p\" (UniqueName: \"kubernetes.io/projected/9f938443-5e76-454f-b814-23d0f60515d4-kube-api-access-2fl4p\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616805 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616871 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f938443-5e76-454f-b814-23d0f60515d4-audit-dir\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.616955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.617008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.617050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " 
pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.617284 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f938443-5e76-454f-b814-23d0f60515d4-audit-dir\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.618100 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.618671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.618766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-audit-policies\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.619025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.625006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.625734 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.626020 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 
07:02:07.626111 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.626312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.626528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.626849 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.626954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9f938443-5e76-454f-b814-23d0f60515d4-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.637677 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fl4p\" (UniqueName: \"kubernetes.io/projected/9f938443-5e76-454f-b814-23d0f60515d4-kube-api-access-2fl4p\") pod \"oauth-openshift-679cb4ddc5-x8vqv\" (UID: \"9f938443-5e76-454f-b814-23d0f60515d4\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.706114 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.734269 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.737474 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.962976 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv"] Dec 06 07:02:07 crc kubenswrapper[4954]: I1206 07:02:07.983052 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.003782 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.008009 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.069712 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.460109 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.606522 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" event={"ID":"9f938443-5e76-454f-b814-23d0f60515d4","Type":"ContainerStarted","Data":"e569fdc406b63985d157202026c6fa21c6833c7f88f3d4aaeb815f18160f0769"} Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.606588 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" event={"ID":"9f938443-5e76-454f-b814-23d0f60515d4","Type":"ContainerStarted","Data":"1ac1ef5ee9b411c5881ad0228298e4d72124c166cc3f3bd6217846fb35033969"} Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.606939 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.631868 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" podStartSLOduration=63.631786536 podStartE2EDuration="1m3.631786536s" podCreationTimestamp="2025-12-06 07:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:02:08.629210617 +0000 UTC m=+303.442570006" watchObservedRunningTime="2025-12-06 07:02:08.631786536 +0000 UTC m=+303.445145945" Dec 06 07:02:08 crc kubenswrapper[4954]: I1206 07:02:08.738545 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" Dec 06 07:02:09 crc kubenswrapper[4954]: I1206 07:02:09.424548 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 07:02:10 crc kubenswrapper[4954]: I1206 07:02:10.136367 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 07:02:15 crc kubenswrapper[4954]: I1206 07:02:15.245428 4954 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 07:02:15 crc kubenswrapper[4954]: I1206 07:02:15.246841 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f9f631dc4762d1aa00e2cb55922fc155966fdb8ee3587f5650abda1a3f961ba1" gracePeriod=5 Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.683662 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.684493 4954 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f9f631dc4762d1aa00e2cb55922fc155966fdb8ee3587f5650abda1a3f961ba1" exitCode=137 Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.827094 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.827211 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.926896 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.926993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927070 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927095 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927101 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927193 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927157 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927255 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927359 4954 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927374 4954 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927384 4954 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.927393 4954 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:20 crc kubenswrapper[4954]: I1206 07:02:20.938260 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.029108 4954 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.450803 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.451377 4954 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.462818 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.462880 4954 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="06ded8ea-ccfc-42d5-8e85-7c03ff1fea92" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.466160 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.466222 4954 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="06ded8ea-ccfc-42d5-8e85-7c03ff1fea92" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.693500 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.693641 4954 scope.go:117] "RemoveContainer" containerID="f9f631dc4762d1aa00e2cb55922fc155966fdb8ee3587f5650abda1a3f961ba1" Dec 06 07:02:21 crc kubenswrapper[4954]: I1206 07:02:21.693781 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 07:02:27 crc kubenswrapper[4954]: I1206 07:02:27.744530 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 06 07:02:27 crc kubenswrapper[4954]: I1206 07:02:27.747468 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 07:02:27 crc kubenswrapper[4954]: I1206 07:02:27.747536 4954 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ae3b796603fc5213051f46425c4579e6965667523eb33272f8e336d59eff51a9" exitCode=137 Dec 06 07:02:27 crc kubenswrapper[4954]: I1206 07:02:27.747603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ae3b796603fc5213051f46425c4579e6965667523eb33272f8e336d59eff51a9"} Dec 06 07:02:27 crc kubenswrapper[4954]: I1206 07:02:27.747651 4954 scope.go:117] "RemoveContainer" containerID="de5913af46a45e08c8e6774fc882ac9c66797ddc6429110d193bab506e91126e" Dec 06 07:02:28 crc kubenswrapper[4954]: I1206 07:02:28.758132 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 06 07:02:28 crc kubenswrapper[4954]: I1206 07:02:28.759912 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"da24c0214addedcb2193fb2b1cd0f251e01785277e2083391e7bce481babb2d8"} Dec 06 07:02:29 crc kubenswrapper[4954]: I1206 07:02:29.088150 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 07:02:37 crc kubenswrapper[4954]: I1206 07:02:37.518286 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 07:02:37 crc kubenswrapper[4954]: I1206 07:02:37.525657 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 07:02:37 crc kubenswrapper[4954]: I1206 07:02:37.825942 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.355798 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.356988 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" podUID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" containerName="route-controller-manager" containerID="cri-o://ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a" gracePeriod=30 Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.367745 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"] Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.368059 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerName="controller-manager" containerID="cri-o://572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654" gracePeriod=30 Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.523607 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xv4c7"] Dec 06 07:02:47 crc kubenswrapper[4954]: E1206 07:02:47.524622 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.524646 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.524788 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.525499 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.539130 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xv4c7"] Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.611716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-registry-certificates\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.611815 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/814920d2-586e-4329-8215-4c61feb12041-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612070 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-registry-tls\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-bound-sa-token\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612164 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-trusted-ca\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612218 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/814920d2-586e-4329-8215-4c61feb12041-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psknb\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-kube-api-access-psknb\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.612307 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.673821 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713747 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-registry-certificates\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713808 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/814920d2-586e-4329-8215-4c61feb12041-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713845 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-registry-tls\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713864 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-bound-sa-token\") pod 
\"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-trusted-ca\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713914 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/814920d2-586e-4329-8215-4c61feb12041-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.713935 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psknb\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-kube-api-access-psknb\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.715840 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/814920d2-586e-4329-8215-4c61feb12041-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.716299 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-trusted-ca\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.717007 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/814920d2-586e-4329-8215-4c61feb12041-registry-certificates\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.725938 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/814920d2-586e-4329-8215-4c61feb12041-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.733067 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psknb\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-kube-api-access-psknb\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.734133 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-bound-sa-token\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.735443 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/814920d2-586e-4329-8215-4c61feb12041-registry-tls\") pod \"image-registry-66df7c8f76-xv4c7\" (UID: \"814920d2-586e-4329-8215-4c61feb12041\") " pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.817288 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.822612 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.855926 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.898910 4954 generic.go:334] "Generic (PLEG): container finished" podID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerID="572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654" exitCode=0 Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.899002 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.899025 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" event={"ID":"11bae354-1c41-442d-820c-d3cd3fa537d8","Type":"ContainerDied","Data":"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654"} Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.899672 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jzdxj" event={"ID":"11bae354-1c41-442d-820c-d3cd3fa537d8","Type":"ContainerDied","Data":"b1d40dfa8ee62fdc64b51200d0003552e46f1079f39cfa20dae90dee48f4aa6c"} Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.899723 4954 scope.go:117] "RemoveContainer" containerID="572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.904035 4954 generic.go:334] "Generic (PLEG): container finished" podID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" containerID="ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a" exitCode=0 Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.904084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" event={"ID":"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919","Type":"ContainerDied","Data":"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a"} Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.904119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" 
event={"ID":"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919","Type":"ContainerDied","Data":"7ce19d9d3f44a097f43a70bb9a890a106167cbb5f8cc0d5c173e7521b2ff9e50"} Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.904172 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.916336 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca\") pod \"11bae354-1c41-442d-820c-d3cd3fa537d8\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.916723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca\") pod \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.916871 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c6hk\" (UniqueName: \"kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk\") pod \"11bae354-1c41-442d-820c-d3cd3fa537d8\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917029 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config\") pod \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert\") pod \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917317 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles\") pod \"11bae354-1c41-442d-820c-d3cd3fa537d8\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917386 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config\") pod \"11bae354-1c41-442d-820c-d3cd3fa537d8\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917487 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdsm4\" (UniqueName: \"kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4\") pod \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\" (UID: \"1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919\") " Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.917606 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert\") pod \"11bae354-1c41-442d-820c-d3cd3fa537d8\" (UID: \"11bae354-1c41-442d-820c-d3cd3fa537d8\") " Dec 06 07:02:47 crc 
kubenswrapper[4954]: I1206 07:02:47.921631 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config" (OuterVolumeSpecName: "config") pod "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" (UID: "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.922299 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "11bae354-1c41-442d-820c-d3cd3fa537d8" (UID: "11bae354-1c41-442d-820c-d3cd3fa537d8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.922607 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" (UID: "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.923437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config" (OuterVolumeSpecName: "config") pod "11bae354-1c41-442d-820c-d3cd3fa537d8" (UID: "11bae354-1c41-442d-820c-d3cd3fa537d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.923483 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "11bae354-1c41-442d-820c-d3cd3fa537d8" (UID: "11bae354-1c41-442d-820c-d3cd3fa537d8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.925748 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11bae354-1c41-442d-820c-d3cd3fa537d8" (UID: "11bae354-1c41-442d-820c-d3cd3fa537d8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.926126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" (UID: "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.926940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4" (OuterVolumeSpecName: "kube-api-access-xdsm4") pod "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" (UID: "1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919"). InnerVolumeSpecName "kube-api-access-xdsm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.928291 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk" (OuterVolumeSpecName: "kube-api-access-5c6hk") pod "11bae354-1c41-442d-820c-d3cd3fa537d8" (UID: "11bae354-1c41-442d-820c-d3cd3fa537d8"). InnerVolumeSpecName "kube-api-access-5c6hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.943681 4954 scope.go:117] "RemoveContainer" containerID="572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654" Dec 06 07:02:47 crc kubenswrapper[4954]: E1206 07:02:47.946015 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654\": container with ID starting with 572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654 not found: ID does not exist" containerID="572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.946068 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654"} err="failed to get container status \"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654\": rpc error: code = NotFound desc = could not find container \"572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654\": container with ID starting with 572e5903bba51c9990996f26f88df5636edf030e88195b0ed6968be17b3e6654 not found: ID does not exist" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.946099 4954 scope.go:117] "RemoveContainer" containerID="ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.967571 4954 scope.go:117] "RemoveContainer" containerID="ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a" Dec 06 07:02:47 crc kubenswrapper[4954]: E1206 07:02:47.968134 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a\": container with ID starting with ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a not found: ID does not exist" containerID="ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a" Dec 06 07:02:47 crc kubenswrapper[4954]: I1206 07:02:47.968185 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a"} err="failed to get container status \"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a\": rpc error: code = NotFound desc = could not find container \"ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a\": container with ID starting with ed274cbd2937dd9c60f84900f8173ec9ee4d8ec6945bafd8e9343ee9c7c31c2a not found: ID does not exist" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020693 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020744 4954 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020755 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdsm4\" (UniqueName: \"kubernetes.io/projected/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-kube-api-access-xdsm4\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020764 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bae354-1c41-442d-820c-d3cd3fa537d8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020775 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11bae354-1c41-442d-820c-d3cd3fa537d8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020785 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020798 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c6hk\" (UniqueName: \"kubernetes.io/projected/11bae354-1c41-442d-820c-d3cd3fa537d8-kube-api-access-5c6hk\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020831 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.020841 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.143449 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xv4c7"] Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.232303 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"] Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.239698 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jzdxj"] Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.251190 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.255715 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pvtl6"] Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.914049 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" event={"ID":"814920d2-586e-4329-8215-4c61feb12041","Type":"ContainerStarted","Data":"51b5359fef32d0e6d202f88532f3951b48ce005072840d3aa393203fdfff3b65"} Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.914599 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" 
event={"ID":"814920d2-586e-4329-8215-4c61feb12041","Type":"ContainerStarted","Data":"b662bf068dba1196f1449f3ee345ecc9225370205e7fc67cd93abf4802808184"} Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.914640 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:02:48 crc kubenswrapper[4954]: I1206 07:02:48.938402 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" podStartSLOduration=1.9383766709999999 podStartE2EDuration="1.938376671s" podCreationTimestamp="2025-12-06 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:02:48.934618388 +0000 UTC m=+343.747977777" watchObservedRunningTime="2025-12-06 07:02:48.938376671 +0000 UTC m=+343.751736070" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.283646 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld"] Dec 06 07:02:49 crc kubenswrapper[4954]: E1206 07:02:49.284004 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" containerName="route-controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.284025 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" containerName="route-controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: E1206 07:02:49.284045 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerName="controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.284054 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerName="controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.284176 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" containerName="route-controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.284199 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" containerName="controller-manager" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.284851 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.288016 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.288267 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.288733 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.288933 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.289084 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.289806 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.293990 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.294087 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.294132 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.294164 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.294787 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.295008 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.295082 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.295227 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.296174 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.304090 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.306368 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld"] Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353270 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5dd\" (UniqueName: \"kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353348 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-serving-cert\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " 
pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353405 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bvj\" (UniqueName: \"kubernetes.io/projected/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-kube-api-access-r5bvj\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353506 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-config\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353607 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-client-ca\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.353918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.450544 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11bae354-1c41-442d-820c-d3cd3fa537d8" path="/var/lib/kubelet/pods/11bae354-1c41-442d-820c-d3cd3fa537d8/volumes" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.451473 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919" path="/var/lib/kubelet/pods/1f83216b-b9cb-4cfe-8b9e-dfcfb1e89919/volumes" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455747 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455819 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455848 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-client-ca\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455872 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5dd\" (UniqueName: \"kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455901 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-serving-cert\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455935 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bvj\" (UniqueName: \"kubernetes.io/projected/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-kube-api-access-r5bvj\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.455993 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-config\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 
crc kubenswrapper[4954]: I1206 07:02:49.456320 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.457403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-client-ca\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.457615 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.457808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-config\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.457996 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.458436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.481850 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5dd\" (UniqueName: \"kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.482521 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bvj\" (UniqueName: \"kubernetes.io/projected/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-kube-api-access-r5bvj\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.486487 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert\") pod \"controller-manager-5f55469458-nd7jf\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.495422 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cd1ec6f-7991-4dfe-9013-91ad2e241d3b-serving-cert\") pod \"route-controller-manager-8575994bb-6x5ld\" (UID: \"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b\") " pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.607705 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:49 crc kubenswrapper[4954]: I1206 07:02:49.616367 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.035517 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:02:50 crc kubenswrapper[4954]: W1206 07:02:50.064405 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440f05eb_1491_4704_87e3_e0865ecfd083.slice/crio-d406f555b0d811cb1ba6089ededeb6368e537dc1eafea6b22a95a45a0a6cd6b8 WatchSource:0}: Error finding container d406f555b0d811cb1ba6089ededeb6368e537dc1eafea6b22a95a45a0a6cd6b8: Status 404 returned error can't find the container with id d406f555b0d811cb1ba6089ededeb6368e537dc1eafea6b22a95a45a0a6cd6b8 Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.107258 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld"] Dec 06 07:02:50 crc kubenswrapper[4954]: W1206 07:02:50.134942 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd1ec6f_7991_4dfe_9013_91ad2e241d3b.slice/crio-79b0091adbe4921f18ca0c2a18bd0284aaa5828e5d1c142ece767d05f4725c41 WatchSource:0}: Error finding container 79b0091adbe4921f18ca0c2a18bd0284aaa5828e5d1c142ece767d05f4725c41: Status 404 returned error can't find the container with id 79b0091adbe4921f18ca0c2a18bd0284aaa5828e5d1c142ece767d05f4725c41 Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.928742 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" event={"ID":"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b","Type":"ContainerStarted","Data":"672e128b580ed3a1a2f95d18cb3293909fc15c5f38f0d478ea43db829e1ef833"} Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.928801 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" event={"ID":"1cd1ec6f-7991-4dfe-9013-91ad2e241d3b","Type":"ContainerStarted","Data":"79b0091adbe4921f18ca0c2a18bd0284aaa5828e5d1c142ece767d05f4725c41"} Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.929945 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.931600 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" event={"ID":"440f05eb-1491-4704-87e3-e0865ecfd083","Type":"ContainerStarted","Data":"40599f3b3d818ef2188670531d60d3c79b28409bf67946a84571184b59ce9208"} Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.931637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" event={"ID":"440f05eb-1491-4704-87e3-e0865ecfd083","Type":"ContainerStarted","Data":"d406f555b0d811cb1ba6089ededeb6368e537dc1eafea6b22a95a45a0a6cd6b8"} Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.931940 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:50 crc kubenswrapper[4954]: I1206 07:02:50.936305 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:02:51 crc kubenswrapper[4954]: I1206 07:02:51.001311 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" podStartSLOduration=4.001284133 podStartE2EDuration="4.001284133s" podCreationTimestamp="2025-12-06 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:02:50.968981159 +0000 UTC m=+345.782340548" watchObservedRunningTime="2025-12-06 07:02:51.001284133 +0000 UTC m=+345.814643522" Dec 06 07:02:51 crc kubenswrapper[4954]: I1206 07:02:51.004419 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" podStartSLOduration=4.004410039 podStartE2EDuration="4.004410039s" podCreationTimestamp="2025-12-06 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:02:51.000919513 +0000 UTC m=+345.814278902" watchObservedRunningTime="2025-12-06 07:02:51.004410039 +0000 UTC m=+345.817769428" Dec 06 07:02:51 crc kubenswrapper[4954]: I1206 07:02:51.190889 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8575994bb-6x5ld" Dec 06 07:03:07 crc kubenswrapper[4954]: I1206 07:03:07.864917 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xv4c7" Dec 06 07:03:07 crc kubenswrapper[4954]: I1206 07:03:07.927447 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"] Dec 06 07:03:08 crc kubenswrapper[4954]: I1206 07:03:08.114510 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:03:08 crc kubenswrapper[4954]: I1206 07:03:08.114859 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" podUID="440f05eb-1491-4704-87e3-e0865ecfd083" containerName="controller-manager" containerID="cri-o://40599f3b3d818ef2188670531d60d3c79b28409bf67946a84571184b59ce9208" gracePeriod=30 Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.042758 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="440f05eb-1491-4704-87e3-e0865ecfd083" containerID="40599f3b3d818ef2188670531d60d3c79b28409bf67946a84571184b59ce9208" exitCode=0 Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.043254 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" event={"ID":"440f05eb-1491-4704-87e3-e0865ecfd083","Type":"ContainerDied","Data":"40599f3b3d818ef2188670531d60d3c79b28409bf67946a84571184b59ce9208"} Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.129657 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.162866 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6986b9b5c-phln9"] Dec 06 07:03:09 crc kubenswrapper[4954]: E1206 07:03:09.163198 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f05eb-1491-4704-87e3-e0865ecfd083" containerName="controller-manager" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.163218 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f05eb-1491-4704-87e3-e0865ecfd083" containerName="controller-manager" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.163372 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="440f05eb-1491-4704-87e3-e0865ecfd083" containerName="controller-manager" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.164079 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.170087 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert\") pod \"440f05eb-1491-4704-87e3-e0865ecfd083\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.170149 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles\") pod \"440f05eb-1491-4704-87e3-e0865ecfd083\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.170191 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq5dd\" (UniqueName: \"kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd\") pod \"440f05eb-1491-4704-87e3-e0865ecfd083\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.170261 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca\") pod \"440f05eb-1491-4704-87e3-e0865ecfd083\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.170282 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config\") pod \"440f05eb-1491-4704-87e3-e0865ecfd083\" (UID: \"440f05eb-1491-4704-87e3-e0865ecfd083\") " Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.171626 4954 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config" (OuterVolumeSpecName: "config") pod "440f05eb-1491-4704-87e3-e0865ecfd083" (UID: "440f05eb-1491-4704-87e3-e0865ecfd083"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.172450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca" (OuterVolumeSpecName: "client-ca") pod "440f05eb-1491-4704-87e3-e0865ecfd083" (UID: "440f05eb-1491-4704-87e3-e0865ecfd083"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.172576 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "440f05eb-1491-4704-87e3-e0865ecfd083" (UID: "440f05eb-1491-4704-87e3-e0865ecfd083"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.179044 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd" (OuterVolumeSpecName: "kube-api-access-bq5dd") pod "440f05eb-1491-4704-87e3-e0865ecfd083" (UID: "440f05eb-1491-4704-87e3-e0865ecfd083"). InnerVolumeSpecName "kube-api-access-bq5dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.183843 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "440f05eb-1491-4704-87e3-e0865ecfd083" (UID: "440f05eb-1491-4704-87e3-e0865ecfd083"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.189848 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6986b9b5c-phln9"] Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.272519 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc530b04-9a42-4424-86d6-4920002e0f07-serving-cert\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.272635 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-client-ca\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.272821 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-proxy-ca-bundles\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.272938 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-config\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.272967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnw9s\" (UniqueName: \"kubernetes.io/projected/dc530b04-9a42-4424-86d6-4920002e0f07-kube-api-access-rnw9s\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.273159 4954 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/440f05eb-1491-4704-87e3-e0865ecfd083-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.273187 4954 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.273238 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq5dd\" (UniqueName: \"kubernetes.io/projected/440f05eb-1491-4704-87e3-e0865ecfd083-kube-api-access-bq5dd\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.273266 4954 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 
07:03:09.273288 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/440f05eb-1491-4704-87e3-e0865ecfd083-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.375133 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc530b04-9a42-4424-86d6-4920002e0f07-serving-cert\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.375249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-client-ca\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.375300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-proxy-ca-bundles\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.375327 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-config\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.375346 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnw9s\" (UniqueName: \"kubernetes.io/projected/dc530b04-9a42-4424-86d6-4920002e0f07-kube-api-access-rnw9s\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.376907 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-client-ca\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.377139 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-config\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.378436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc530b04-9a42-4424-86d6-4920002e0f07-proxy-ca-bundles\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 
07:03:09.379629 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc530b04-9a42-4424-86d6-4920002e0f07-serving-cert\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.396199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnw9s\" (UniqueName: \"kubernetes.io/projected/dc530b04-9a42-4424-86d6-4920002e0f07-kube-api-access-rnw9s\") pod \"controller-manager-6986b9b5c-phln9\" (UID: \"dc530b04-9a42-4424-86d6-4920002e0f07\") " pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.514841 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:09 crc kubenswrapper[4954]: I1206 07:03:09.726624 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6986b9b5c-phln9"] Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.050712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" event={"ID":"dc530b04-9a42-4424-86d6-4920002e0f07","Type":"ContainerStarted","Data":"9f7d507e7d73107b40a5002fbc13762b829766659ad0a48c721f3752cd1116d2"} Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.050783 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" event={"ID":"dc530b04-9a42-4424-86d6-4920002e0f07","Type":"ContainerStarted","Data":"17693f5cb609a1741a13ae8a5912676280a51e931206ad01447133496c06d942"} Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.051170 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.052442 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" event={"ID":"440f05eb-1491-4704-87e3-e0865ecfd083","Type":"ContainerDied","Data":"d406f555b0d811cb1ba6089ededeb6368e537dc1eafea6b22a95a45a0a6cd6b8"} Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.052600 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-nd7jf" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.052700 4954 scope.go:117] "RemoveContainer" containerID="40599f3b3d818ef2188670531d60d3c79b28409bf67946a84571184b59ce9208" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.056642 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.076874 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" podStartSLOduration=2.076839248 podStartE2EDuration="2.076839248s" podCreationTimestamp="2025-12-06 07:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:03:10.074016981 +0000 UTC m=+364.887376380" watchObservedRunningTime="2025-12-06 07:03:10.076839248 +0000 UTC m=+364.890198637" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.102255 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.102342 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.128004 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:03:10 crc kubenswrapper[4954]: I1206 07:03:10.140660 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-nd7jf"] Dec 06 07:03:11 crc kubenswrapper[4954]: I1206 07:03:11.451059 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440f05eb-1491-4704-87e3-e0865ecfd083" path="/var/lib/kubelet/pods/440f05eb-1491-4704-87e3-e0865ecfd083/volumes" Dec 06 07:03:32 crc kubenswrapper[4954]: I1206 07:03:32.968934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" podUID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" containerName="registry" containerID="cri-o://0a96214556870d1e7dd2409421f4eae15f004e3c1071c0837592b26601e398d6" gracePeriod=30 Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.187752 4954 generic.go:334] "Generic (PLEG): container finished" podID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" containerID="0a96214556870d1e7dd2409421f4eae15f004e3c1071c0837592b26601e398d6" exitCode=0 Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.187819 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" event={"ID":"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a","Type":"ContainerDied","Data":"0a96214556870d1e7dd2409421f4eae15f004e3c1071c0837592b26601e398d6"} Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.463529 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.474901 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.474968 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475002 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plrp\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475056 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475076 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475331 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475389 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.475415 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls\") pod \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\" (UID: \"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a\") " Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.476852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.478052 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.492917 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.493680 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.495492 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.497060 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp" (OuterVolumeSpecName: "kube-api-access-7plrp") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "kube-api-access-7plrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.499246 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.500136 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" (UID: "c89e8d9c-6713-4a8b-8da7-bc90619c1f8a"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577757 4954 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577814 4954 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577827 4954 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577839 4954 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577857 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plrp\" (UniqueName: \"kubernetes.io/projected/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-kube-api-access-7plrp\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577870 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:33 crc kubenswrapper[4954]: I1206 07:03:33.577883 4954 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 07:03:34 crc kubenswrapper[4954]: I1206 07:03:34.195927 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" event={"ID":"c89e8d9c-6713-4a8b-8da7-bc90619c1f8a","Type":"ContainerDied","Data":"004f732a4fc7f18394c538b68b079ccf422e6607ffce83b1adf2de484b210837"} Dec 06 07:03:34 crc kubenswrapper[4954]: I1206 07:03:34.195995 4954 scope.go:117] "RemoveContainer" containerID="0a96214556870d1e7dd2409421f4eae15f004e3c1071c0837592b26601e398d6" Dec 06 07:03:34 crc kubenswrapper[4954]: I1206 07:03:34.196064 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sc4fg" Dec 06 07:03:34 crc kubenswrapper[4954]: I1206 07:03:34.235934 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"] Dec 06 07:03:34 crc kubenswrapper[4954]: I1206 07:03:34.240372 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sc4fg"] Dec 06 07:03:35 crc kubenswrapper[4954]: I1206 07:03:35.451390 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" path="/var/lib/kubelet/pods/c89e8d9c-6713-4a8b-8da7-bc90619c1f8a/volumes" Dec 06 07:03:40 crc kubenswrapper[4954]: I1206 07:03:40.101362 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:03:40 crc kubenswrapper[4954]: I1206 07:03:40.101866 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.101668 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.102542 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.102630 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.103382 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.103450 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab" gracePeriod=600 Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.420434 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab" exitCode=0 Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.420672 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab"} Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.421034 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d"} Dec 06 07:04:10 crc kubenswrapper[4954]: I1206 07:04:10.421066 4954 scope.go:117] "RemoveContainer" containerID="9c090f6581aaad8bdcfa148e753f7da5d167a0af53ba16f3c7d1ebb340e77066" Dec 06 07:06:05 crc kubenswrapper[4954]: I1206 07:06:05.653328 4954 scope.go:117] "RemoveContainer" containerID="5af4757e9ea26ef3a1ad1c9124b5cea1df905378f885c22950890d2303be7c41" Dec 06 07:06:10 crc kubenswrapper[4954]: I1206 07:06:10.101542 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:06:10 crc kubenswrapper[4954]: I1206 07:06:10.102093 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:06:40 crc kubenswrapper[4954]: I1206 07:06:40.100869 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:06:40 crc kubenswrapper[4954]: I1206 07:06:40.101800 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.102049 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.103068 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.103160 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.104144 4954 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.104204 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d" gracePeriod=600 Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.566866 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d" exitCode=0 Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.566939 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d"} Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.567279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9"} Dec 06 07:07:10 crc kubenswrapper[4954]: I1206 07:07:10.567305 4954 scope.go:117] "RemoveContainer" containerID="6914d276fe4a0b459ad50829342dc22d6ca712a1175430fe981582770ec6fdab" Dec 06 07:09:10 crc kubenswrapper[4954]: I1206 07:09:10.100814 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:09:10 crc kubenswrapper[4954]: I1206 07:09:10.101480 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:09:40 crc kubenswrapper[4954]: I1206 07:09:40.101437 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:09:40 crc kubenswrapper[4954]: I1206 07:09:40.102426 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:09:49 crc kubenswrapper[4954]: I1206 07:09:49.299048 4954 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.101107 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.101725 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.101770 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.102244 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.102308 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9" gracePeriod=600
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.665138 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9" exitCode=0
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.665182 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9"}
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.666258 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233"}
Dec 06 07:10:10 crc kubenswrapper[4954]: I1206 07:10:10.666301 4954 scope.go:117] "RemoveContainer" containerID="d0c3c6a88b0be66ff290dabcd7ce56c5ffdbad3c9a80efd470d63d942c17ce6d"
Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.554449 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-crz6w"]
Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556039 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-controller" containerID="cri-o://8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6" gracePeriod=30
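
With the ovnkube-node-crz6w DELETE, a whole multi-container pod is torn down at once, so the "Killing container with a grace period" record repeats below for each of its containers. When skimming dumps like this one, such kill events can be pulled out mechanically; a sketch assuming one journal record per line, as in the entries above:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // killRe matches the kubelet's structured "Killing container" records
    // as they appear in this journal.
    var killRe = regexp.MustCompile(`Killing container with a grace period".*?containerName="([^"]+)".*?containerID="([^"]+)".*?gracePeriod=(\d+)`)

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal records can be long
        for sc.Scan() {
            if m := killRe.FindStringSubmatch(sc.Text()); m != nil {
                fmt.Printf("%-24s grace=%-4s %s\n", m[1], m[3], m[2])
            }
        }
    }

Fed this section, it would report, among others, the eight ovnkube-node containers killed here with grace=30 and the machine-config-daemon restarts with grace=600.
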
pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556260 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-acl-logging" containerID="cri-o://a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556190 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="sbdb" containerID="cri-o://07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556297 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-node" containerID="cri-o://087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556156 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="nbdb" containerID="cri-o://771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.556365 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="northd" containerID="cri-o://f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.585721 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" containerID="cri-o://0c9ec58f7b3b34bd066406e31183b463197f7ff4457112cb1e17a7f40e20eadb" gracePeriod=30 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.834199 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovnkube-controller/3.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.838253 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-acl-logging/0.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.838846 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-controller/0.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839336 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="0c9ec58f7b3b34bd066406e31183b463197f7ff4457112cb1e17a7f40e20eadb" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839377 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" 
containerID="07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839385 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839395 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839405 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839405 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"0c9ec58f7b3b34bd066406e31183b463197f7ff4457112cb1e17a7f40e20eadb"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839471 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839500 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839511 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839424 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e" exitCode=0 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839540 4954 scope.go:117] "RemoveContainer" containerID="eb1edbcb12fbf2d933be983b2d4ed0eef36eb346d6d792ed44e67c0f3377b7f1" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839542 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456" exitCode=143 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839703 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839717 4954 generic.go:334] "Generic (PLEG): container finished" podID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerID="8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6" exitCode=143 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839742 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839764 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" event={"ID":"7cc429b1-3932-4515-a2b8-f0dd601f3e4c","Type":"ContainerDied","Data":"9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.839782 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb7f752f403d46b9bebe914f4a9490233f5c9a23fa8eee1613b03c6335ce412" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.842162 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/2.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.842681 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/1.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.842724 4954 generic.go:334] "Generic (PLEG): container finished" podID="1d174f37-f89e-4daf-a663-3cad4e33dad2" containerID="927b002a44fb819e93b6e6a1f9d1406a9e6322216198277c321d2d21450973f5" exitCode=2 Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.842753 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerDied","Data":"927b002a44fb819e93b6e6a1f9d1406a9e6322216198277c321d2d21450973f5"} Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.843334 4954 scope.go:117] "RemoveContainer" containerID="927b002a44fb819e93b6e6a1f9d1406a9e6322216198277c321d2d21450973f5" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.847092 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-acl-logging/0.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.847758 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-controller/0.log" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.848324 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.874180 4954 scope.go:117] "RemoveContainer" containerID="2fc97358db525a0093a6b1350179318aec4b2da84f919bbdf3f2e5a56205a363" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895545 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895594 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895725 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895751 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895767 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895791 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895823 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895853 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895899 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2rg6\" (UniqueName: \"kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895931 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895952 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895968 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.895992 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896015 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896043 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896073 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896096 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896150 
4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin\") pod \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\" (UID: \"7cc429b1-3932-4515-a2b8-f0dd601f3e4c\") " Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896431 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896647 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.896675 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897044 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897286 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897357 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log" (OuterVolumeSpecName: "node-log") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897476 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket" (OuterVolumeSpecName: "log-socket") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897523 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897581 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash" (OuterVolumeSpecName: "host-slash") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897623 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897631 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897669 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897649 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897693 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897712 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.897843 4954 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.898225 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.903368 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.905708 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.905976 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6" (OuterVolumeSpecName: "kube-api-access-f2rg6") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "kube-api-access-f2rg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.917329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7cc429b1-3932-4515-a2b8-f0dd601f3e4c" (UID: "7cc429b1-3932-4515-a2b8-f0dd601f3e4c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926332 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bb2zw"] Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926639 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="northd" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926658 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="northd" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926667 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926674 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926684 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926691 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926703 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926712 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926724 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926732 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926742 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="nbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926749 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="nbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926762 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kubecfg-setup" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926773 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kubecfg-setup" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926792 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-node" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926800 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-node" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926811 4954 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="sbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926819 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="sbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926827 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926832 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926839 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926844 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926853 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" containerName="registry" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926859 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" containerName="registry" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.926870 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-acl-logging" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926875 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-acl-logging" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926986 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.926998 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927005 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927012 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="sbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927021 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="kube-rbac-proxy-node" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927030 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89e8d9c-6713-4a8b-8da7-bc90619c1f8a" containerName="registry" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927038 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927044 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="nbdb" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927050 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927056 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="northd" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927064 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovn-acl-logging" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927074 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: E1206 07:10:37.927163 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927170 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.927272 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" containerName="ovnkube-controller" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.929258 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998640 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-slash\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-node-log\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998736 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86thk\" (UniqueName: \"kubernetes.io/projected/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-kube-api-access-86thk\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-netns\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998776 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 
07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998794 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-etc-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998815 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-var-lib-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998864 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-systemd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998882 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-env-overrides\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998906 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998979 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-systemd-units\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.998996 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-config\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-bin\") pod 
\"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-ovn\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-log-socket\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999114 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-script-lib\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999132 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-netd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-kubelet\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999167 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovn-node-metrics-cert\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999288 4954 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999325 4954 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999337 4954 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999347 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999356 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999365 4954 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-log-socket\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999374 4954 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999384 4954 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999396 4954 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-slash\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999407 4954 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999417 4954 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999428 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2rg6\" (UniqueName: \"kubernetes.io/projected/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-kube-api-access-f2rg6\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999436 4954 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999445 4954 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-node-log\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999453 4954 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999461 4954 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999493 4954 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999502 4954 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:37 crc kubenswrapper[4954]: I1206 07:10:37.999512 4954 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cc429b1-3932-4515-a2b8-f0dd601f3e4c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101637 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-script-lib\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-netd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101756 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-kubelet\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovn-node-metrics-cert\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-node-log\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101840 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-slash\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101867 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86thk\" (UniqueName: \"kubernetes.io/projected/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-kube-api-access-86thk\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-netns\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101940 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-etc-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-var-lib-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.101997 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102021 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-systemd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102040 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-env-overrides\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102107 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-systemd-units\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-config\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102154 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-bin\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-ovn\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102193 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-log-socket\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102292 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-log-socket\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102352 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-netd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102380 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-kubelet\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-script-lib\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102770 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-node-log\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.102856 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-slash\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-var-lib-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103326 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-systemd-units\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-run-netns\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-host-cni-bin\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103290 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-etc-openvswitch\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-ovn\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.103490 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-run-systemd\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.104022 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-env-overrides\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.104089 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovnkube-config\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.108528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-ovn-node-metrics-cert\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.120691 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86thk\" (UniqueName: \"kubernetes.io/projected/c527a9e1-e1f5-45f7-88c5-6d22c902c47c-kube-api-access-86thk\") pod \"ovnkube-node-bb2zw\" (UID: \"c527a9e1-e1f5-45f7-88c5-6d22c902c47c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.248192 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:38 crc kubenswrapper[4954]: W1206 07:10:38.266370 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc527a9e1_e1f5_45f7_88c5_6d22c902c47c.slice/crio-2cfa8a1c9453b7ba2f6ad0b339c6ac46007ebfc576c10ea1807d9e837ced9192 WatchSource:0}: Error finding container 2cfa8a1c9453b7ba2f6ad0b339c6ac46007ebfc576c10ea1807d9e837ced9192: Status 404 returned error can't find the container with id 2cfa8a1c9453b7ba2f6ad0b339c6ac46007ebfc576c10ea1807d9e837ced9192 Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.859787 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-acl-logging/0.log" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.861878 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crz6w_7cc429b1-3932-4515-a2b8-f0dd601f3e4c/ovn-controller/0.log" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.862447 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crz6w" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.864753 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rsvgk_1d174f37-f89e-4daf-a663-3cad4e33dad2/kube-multus/2.log" Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.864899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rsvgk" event={"ID":"1d174f37-f89e-4daf-a663-3cad4e33dad2","Type":"ContainerStarted","Data":"e8e0f3fea576265f19468d8585710c03454f6067f46b94c928800b852874f357"} Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.866890 4954 generic.go:334] "Generic (PLEG): container finished" podID="c527a9e1-e1f5-45f7-88c5-6d22c902c47c" containerID="bad0f8c43a315f8eca54cd84c5f967982fde0f043666cde1adefe630d0fa43dd" exitCode=0 Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.866945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerDied","Data":"bad0f8c43a315f8eca54cd84c5f967982fde0f043666cde1adefe630d0fa43dd"} Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.866983 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"2cfa8a1c9453b7ba2f6ad0b339c6ac46007ebfc576c10ea1807d9e837ced9192"} Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.928459 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-crz6w"] Dec 06 07:10:38 crc kubenswrapper[4954]: I1206 07:10:38.939440 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-crz6w"] Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.451969 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc429b1-3932-4515-a2b8-f0dd601f3e4c" path="/var/lib/kubelet/pods/7cc429b1-3932-4515-a2b8-f0dd601f3e4c/volumes" Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876522 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"44af10ccef49b795374064de91e8f3dab0b454bc5fc8d4a0ca7e9badeb81e607"} Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"d829eeef5f57e9a8ca522cd71be87347c1409bb82f4311e5c163be650c88b6e1"} Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876628 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"be404c07501be4f695b63b8dd192492cb2b401af15c1606ea92b27fba49a55c1"} Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"48bd0106c60ec11dca78b01cf15224189ca2e0a64984c97b098666199edb0155"} Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" 
event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"44efdf021ae68fa85b71b7aaa77ee53803d1d675d1307bee49638c0d1b67e3ad"} Dec 06 07:10:39 crc kubenswrapper[4954]: I1206 07:10:39.876664 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"04ead37bd3eccb13a2af8a70fa5f7e123809e9f2958622bcbe163777250867e5"} Dec 06 07:10:42 crc kubenswrapper[4954]: I1206 07:10:42.900231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"5dd3148be11fc0353b8a816b0f4e7ff9179375bc6901763b3cb60d2b6a90bfe0"} Dec 06 07:10:46 crc kubenswrapper[4954]: I1206 07:10:46.929048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" event={"ID":"c527a9e1-e1f5-45f7-88c5-6d22c902c47c","Type":"ContainerStarted","Data":"887e7a0ab3886f61a42735b2bf98cd8d13c49df8e6aa70fb625603ef7f39a340"} Dec 06 07:10:46 crc kubenswrapper[4954]: I1206 07:10:46.929653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:46 crc kubenswrapper[4954]: I1206 07:10:46.929674 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:46 crc kubenswrapper[4954]: I1206 07:10:46.958383 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:46 crc kubenswrapper[4954]: I1206 07:10:46.967355 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" podStartSLOduration=9.967335213 podStartE2EDuration="9.967335213s" podCreationTimestamp="2025-12-06 07:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:10:46.960723787 +0000 UTC m=+821.774083186" watchObservedRunningTime="2025-12-06 07:10:46.967335213 +0000 UTC m=+821.780694602" Dec 06 07:10:47 crc kubenswrapper[4954]: I1206 07:10:47.936095 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:47 crc kubenswrapper[4954]: I1206 07:10:47.963969 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.310686 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5ql4s"] Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.311965 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.314263 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.315175 4954 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-98b5k" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.315202 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.317285 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.320010 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5ql4s"] Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.357011 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.357073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.357117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9556\" (UniqueName: \"kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.458729 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.458802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.458842 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9556\" (UniqueName: \"kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.459207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " 
pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.459814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.482760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9556\" (UniqueName: \"kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556\") pod \"crc-storage-crc-5ql4s\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.633777 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.848589 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5ql4s"] Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.858865 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:10:49 crc kubenswrapper[4954]: I1206 07:10:49.952175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5ql4s" event={"ID":"8eaf5c66-35fa-41ac-915e-adfac50f0261","Type":"ContainerStarted","Data":"975cc0144afbb873b42bb467a7d34c61771b3bb1f0b4a0021d61aa9680a514d8"} Dec 06 07:10:52 crc kubenswrapper[4954]: I1206 07:10:52.973481 4954 generic.go:334] "Generic (PLEG): container finished" podID="8eaf5c66-35fa-41ac-915e-adfac50f0261" containerID="df96f4f2d31d66b67a00f4ca761a910560a4aa6b969945f427faf07832ee150c" exitCode=0 Dec 06 07:10:52 crc kubenswrapper[4954]: I1206 07:10:52.973612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5ql4s" event={"ID":"8eaf5c66-35fa-41ac-915e-adfac50f0261","Type":"ContainerDied","Data":"df96f4f2d31d66b67a00f4ca761a910560a4aa6b969945f427faf07832ee150c"} Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.225491 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.328916 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt\") pod \"8eaf5c66-35fa-41ac-915e-adfac50f0261\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.329126 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage\") pod \"8eaf5c66-35fa-41ac-915e-adfac50f0261\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.329129 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8eaf5c66-35fa-41ac-915e-adfac50f0261" (UID: "8eaf5c66-35fa-41ac-915e-adfac50f0261"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.329157 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9556\" (UniqueName: \"kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556\") pod \"8eaf5c66-35fa-41ac-915e-adfac50f0261\" (UID: \"8eaf5c66-35fa-41ac-915e-adfac50f0261\") " Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.329794 4954 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8eaf5c66-35fa-41ac-915e-adfac50f0261-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.345836 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8eaf5c66-35fa-41ac-915e-adfac50f0261" (UID: "8eaf5c66-35fa-41ac-915e-adfac50f0261"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.351583 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556" (OuterVolumeSpecName: "kube-api-access-k9556") pod "8eaf5c66-35fa-41ac-915e-adfac50f0261" (UID: "8eaf5c66-35fa-41ac-915e-adfac50f0261"). InnerVolumeSpecName "kube-api-access-k9556". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.430748 4954 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8eaf5c66-35fa-41ac-915e-adfac50f0261-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.430805 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9556\" (UniqueName: \"kubernetes.io/projected/8eaf5c66-35fa-41ac-915e-adfac50f0261-kube-api-access-k9556\") on node \"crc\" DevicePath \"\"" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.988992 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5ql4s" event={"ID":"8eaf5c66-35fa-41ac-915e-adfac50f0261","Type":"ContainerDied","Data":"975cc0144afbb873b42bb467a7d34c61771b3bb1f0b4a0021d61aa9680a514d8"} Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.989059 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5ql4s" Dec 06 07:10:54 crc kubenswrapper[4954]: I1206 07:10:54.989073 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="975cc0144afbb873b42bb467a7d34c61771b3bb1f0b4a0021d61aa9680a514d8" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.951698 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9"] Dec 06 07:11:01 crc kubenswrapper[4954]: E1206 07:11:01.952535 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaf5c66-35fa-41ac-915e-adfac50f0261" containerName="storage" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.952552 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaf5c66-35fa-41ac-915e-adfac50f0261" containerName="storage" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.952740 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaf5c66-35fa-41ac-915e-adfac50f0261" containerName="storage" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.953664 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.955773 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 07:11:01 crc kubenswrapper[4954]: I1206 07:11:01.964130 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9"] Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.141686 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.141774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhnx\" (UniqueName: \"kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.141840 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.243754 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.244077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.244203 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhnx\" (UniqueName: \"kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.244435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.245641 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.270680 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhnx\" (UniqueName: \"kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.275001 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:02 crc kubenswrapper[4954]: I1206 07:11:02.487217 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9"] Dec 06 07:11:02 crc kubenswrapper[4954]: W1206 07:11:02.494112 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3269d5_87e6_4cd2_a667_217f6da3b3f9.slice/crio-666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f WatchSource:0}: Error finding container 666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f: Status 404 returned error can't find the container with id 666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.039687 4954 generic.go:334] "Generic (PLEG): container finished" podID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerID="b0023fb7376f640dba5e58172f4e4a626b4cb0732627b8ddc6aa814ab1575733" exitCode=0 Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.039752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" event={"ID":"dd3269d5-87e6-4cd2-a667-217f6da3b3f9","Type":"ContainerDied","Data":"b0023fb7376f640dba5e58172f4e4a626b4cb0732627b8ddc6aa814ab1575733"} Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.039868 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" event={"ID":"dd3269d5-87e6-4cd2-a667-217f6da3b3f9","Type":"ContainerStarted","Data":"666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f"} Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.755596 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"] Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.756994 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.775440 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"] Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.866296 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.866392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fl2\" (UniqueName: \"kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.866550 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.968052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.968147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fl2\" (UniqueName: \"kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.968179 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.968673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.968685 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:03 crc kubenswrapper[4954]: I1206 07:11:03.995384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m4fl2\" (UniqueName: \"kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2\") pod \"redhat-operators-9pdfn\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") " pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:04 crc kubenswrapper[4954]: I1206 07:11:04.077238 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:04 crc kubenswrapper[4954]: I1206 07:11:04.391603 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"] Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.053785 4954 generic.go:334] "Generic (PLEG): container finished" podID="ed01c7d5-c189-423f-a812-a4963501bd12" containerID="9bd63291a0e444cc4d47156a929b528dc03fbe1b4953147b37b28079f8dde006" exitCode=0 Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.054180 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerDied","Data":"9bd63291a0e444cc4d47156a929b528dc03fbe1b4953147b37b28079f8dde006"} Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.054218 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerStarted","Data":"51d0d27e2e5377cae488afcd5e6d4f2594186b70a41bb5b6c5ebb34189299eb2"} Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.059324 4954 generic.go:334] "Generic (PLEG): container finished" podID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerID="0d1e898780f31169ea8ddcc3cb550f9811460ec959e42316c8ba5a3b89833a41" exitCode=0 Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.059385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" event={"ID":"dd3269d5-87e6-4cd2-a667-217f6da3b3f9","Type":"ContainerDied","Data":"0d1e898780f31169ea8ddcc3cb550f9811460ec959e42316c8ba5a3b89833a41"} Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.742598 4954 scope.go:117] "RemoveContainer" containerID="bdfe1b9103cc84a0e615eaf87edaf139f8223355173d5eea612e14aaee868d8c" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.761675 4954 scope.go:117] "RemoveContainer" containerID="07bd7dbfbd09451ec330aa0f2a5db9a677754f3146d1c76df75dfb9e3bd2fc6f" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.777627 4954 scope.go:117] "RemoveContainer" containerID="0c9ec58f7b3b34bd066406e31183b463197f7ff4457112cb1e17a7f40e20eadb" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.795122 4954 scope.go:117] "RemoveContainer" containerID="8b5e30996ac73864fcd251029efdcd61ca2baeb94ca0b1e058e75168eba91cb6" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.811605 4954 scope.go:117] "RemoveContainer" containerID="087a4376f658ce13c852b8285a351ac1171f86a197d974549b6506dc15f7362e" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.839298 4954 scope.go:117] "RemoveContainer" containerID="51f18503377fdb96fd4dc0f24f33cd341c9f86f8e9813eda09ca263b4946d672" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.853718 4954 scope.go:117] "RemoveContainer" containerID="771737326aa1f4bce3b79b01789eeb3c582bfa322b74c827fd0de90deb9808b4" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.894411 4954 scope.go:117] "RemoveContainer" 
containerID="f3a36f04936ad736767fa47da34331d74624541def6549d47c54fa194b2d452b" Dec 06 07:11:05 crc kubenswrapper[4954]: I1206 07:11:05.909320 4954 scope.go:117] "RemoveContainer" containerID="a71da50370047722e8b60953dbde28d22567c61eb4460d128e2c90f2875a3456" Dec 06 07:11:06 crc kubenswrapper[4954]: I1206 07:11:06.068070 4954 generic.go:334] "Generic (PLEG): container finished" podID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerID="ae02e00e9204c6cdd9b2821b94285f6917e814f26531b974ef76217a8f635788" exitCode=0 Dec 06 07:11:06 crc kubenswrapper[4954]: I1206 07:11:06.068141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" event={"ID":"dd3269d5-87e6-4cd2-a667-217f6da3b3f9","Type":"ContainerDied","Data":"ae02e00e9204c6cdd9b2821b94285f6917e814f26531b974ef76217a8f635788"} Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.078167 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerStarted","Data":"9f95baf92899398063747a2915f83ac1f2430b49b4cce0163c5b964963211f6d"} Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.434277 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.619061 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle\") pod \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.619144 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util\") pod \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.619199 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bhnx\" (UniqueName: \"kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx\") pod \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\" (UID: \"dd3269d5-87e6-4cd2-a667-217f6da3b3f9\") " Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.620061 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle" (OuterVolumeSpecName: "bundle") pod "dd3269d5-87e6-4cd2-a667-217f6da3b3f9" (UID: "dd3269d5-87e6-4cd2-a667-217f6da3b3f9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.620470 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.627166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx" (OuterVolumeSpecName: "kube-api-access-8bhnx") pod "dd3269d5-87e6-4cd2-a667-217f6da3b3f9" (UID: "dd3269d5-87e6-4cd2-a667-217f6da3b3f9"). InnerVolumeSpecName "kube-api-access-8bhnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.633821 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util" (OuterVolumeSpecName: "util") pod "dd3269d5-87e6-4cd2-a667-217f6da3b3f9" (UID: "dd3269d5-87e6-4cd2-a667-217f6da3b3f9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.722134 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-util\") on node \"crc\" DevicePath \"\"" Dec 06 07:11:07 crc kubenswrapper[4954]: I1206 07:11:07.722198 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bhnx\" (UniqueName: \"kubernetes.io/projected/dd3269d5-87e6-4cd2-a667-217f6da3b3f9-kube-api-access-8bhnx\") on node \"crc\" DevicePath \"\"" Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.085205 4954 generic.go:334] "Generic (PLEG): container finished" podID="ed01c7d5-c189-423f-a812-a4963501bd12" containerID="9f95baf92899398063747a2915f83ac1f2430b49b4cce0163c5b964963211f6d" exitCode=0 Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.085305 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerDied","Data":"9f95baf92899398063747a2915f83ac1f2430b49b4cce0163c5b964963211f6d"} Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.088639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" event={"ID":"dd3269d5-87e6-4cd2-a667-217f6da3b3f9","Type":"ContainerDied","Data":"666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f"} Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.088717 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="666a27ad644e04714cc1fc205af929cbd34d8471ce17e6b5476c701fd1fd092f" Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.088739 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9" Dec 06 07:11:08 crc kubenswrapper[4954]: I1206 07:11:08.272520 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bb2zw" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.097012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerStarted","Data":"25753b34b35aa66935fd29d5b72fc5092159b70778b78dfb0cab54286a9628c2"} Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.122265 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9pdfn" podStartSLOduration=2.689413518 podStartE2EDuration="6.122245631s" podCreationTimestamp="2025-12-06 07:11:03 +0000 UTC" firstStartedPulling="2025-12-06 07:11:05.057807755 +0000 UTC m=+839.871167144" lastFinishedPulling="2025-12-06 07:11:08.490639868 +0000 UTC m=+843.303999257" observedRunningTime="2025-12-06 07:11:09.117254717 +0000 UTC m=+843.930614116" watchObservedRunningTime="2025-12-06 07:11:09.122245631 +0000 UTC m=+843.935605020" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.662652 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm"] Dec 06 07:11:09 crc kubenswrapper[4954]: E1206 07:11:09.663394 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="pull" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.663421 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="pull" Dec 06 07:11:09 crc kubenswrapper[4954]: E1206 07:11:09.663450 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="util" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.663458 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="util" Dec 06 07:11:09 crc kubenswrapper[4954]: E1206 07:11:09.663469 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="extract" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.663480 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="extract" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.663609 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3269d5-87e6-4cd2-a667-217f6da3b3f9" containerName="extract" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.664100 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.666295 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.666769 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.670007 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-px4rc" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.675058 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm"] Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.853638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkps2\" (UniqueName: \"kubernetes.io/projected/b50936c1-0d88-46be-8fbb-d554594c21a3-kube-api-access-pkps2\") pod \"nmstate-operator-5b5b58f5c8-ftlnm\" (UID: \"b50936c1-0d88-46be-8fbb-d554594c21a3\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.955663 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkps2\" (UniqueName: \"kubernetes.io/projected/b50936c1-0d88-46be-8fbb-d554594c21a3-kube-api-access-pkps2\") pod \"nmstate-operator-5b5b58f5c8-ftlnm\" (UID: \"b50936c1-0d88-46be-8fbb-d554594c21a3\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.975074 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkps2\" (UniqueName: \"kubernetes.io/projected/b50936c1-0d88-46be-8fbb-d554594c21a3-kube-api-access-pkps2\") pod \"nmstate-operator-5b5b58f5c8-ftlnm\" (UID: \"b50936c1-0d88-46be-8fbb-d554594c21a3\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" Dec 06 07:11:09 crc kubenswrapper[4954]: I1206 07:11:09.980208 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" Dec 06 07:11:10 crc kubenswrapper[4954]: I1206 07:11:10.240090 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm"] Dec 06 07:11:10 crc kubenswrapper[4954]: W1206 07:11:10.251592 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50936c1_0d88_46be_8fbb_d554594c21a3.slice/crio-fe7d0ab251cc336da120fa50c54d73a1cadc6b3dd32ede9ca51bcdeb434c3ee3 WatchSource:0}: Error finding container fe7d0ab251cc336da120fa50c54d73a1cadc6b3dd32ede9ca51bcdeb434c3ee3: Status 404 returned error can't find the container with id fe7d0ab251cc336da120fa50c54d73a1cadc6b3dd32ede9ca51bcdeb434c3ee3 Dec 06 07:11:11 crc kubenswrapper[4954]: I1206 07:11:11.117040 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" event={"ID":"b50936c1-0d88-46be-8fbb-d554594c21a3","Type":"ContainerStarted","Data":"fe7d0ab251cc336da120fa50c54d73a1cadc6b3dd32ede9ca51bcdeb434c3ee3"} Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.078366 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.078881 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.138914 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.139341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" event={"ID":"b50936c1-0d88-46be-8fbb-d554594c21a3","Type":"ContainerStarted","Data":"90630f11f6949af1dae4d78fc79bd4ce5475847b169280a57b0fa8ddffbf00a1"} Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.182494 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9pdfn" Dec 06 07:11:14 crc kubenswrapper[4954]: I1206 07:11:14.204293 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-ftlnm" podStartSLOduration=2.339184351 podStartE2EDuration="5.204274005s" podCreationTimestamp="2025-12-06 07:11:09 +0000 UTC" firstStartedPulling="2025-12-06 07:11:10.255189189 +0000 UTC m=+845.068548578" lastFinishedPulling="2025-12-06 07:11:13.120278843 +0000 UTC m=+847.933638232" observedRunningTime="2025-12-06 07:11:14.177839959 +0000 UTC m=+848.991199348" watchObservedRunningTime="2025-12-06 07:11:14.204274005 +0000 UTC m=+849.017633394" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.157971 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.159683 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.162903 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-n9hcp" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.167936 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.168776 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.183902 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.188533 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.224193 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.228068 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k9nmn"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.229054 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337437 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337495 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg597\" (UniqueName: \"kubernetes.io/projected/257074d3-7882-4f60-b7d6-92374480fafa-kube-api-access-hg597\") pod \"nmstate-metrics-7f946cbc9-cvwlf\" (UID: \"257074d3-7882-4f60-b7d6-92374480fafa\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-ovs-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337578 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-dbus-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337614 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmm5\" (UniqueName: \"kubernetes.io/projected/60516977-e70c-4cd1-a029-15f0be8f3a12-kube-api-access-fnmm5\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " 
pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337718 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lqv\" (UniqueName: \"kubernetes.io/projected/2d304ffb-d35d-4464-9f6f-8b1ed846b641-kube-api-access-q5lqv\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.337887 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-nmstate-lock\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.436687 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.437718 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.438862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.438920 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg597\" (UniqueName: \"kubernetes.io/projected/257074d3-7882-4f60-b7d6-92374480fafa-kube-api-access-hg597\") pod \"nmstate-metrics-7f946cbc9-cvwlf\" (UID: \"257074d3-7882-4f60-b7d6-92374480fafa\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.438949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-ovs-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.438979 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-dbus-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.439002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmm5\" (UniqueName: \"kubernetes.io/projected/60516977-e70c-4cd1-a029-15f0be8f3a12-kube-api-access-fnmm5\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.439050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lqv\" (UniqueName: \"kubernetes.io/projected/2d304ffb-d35d-4464-9f6f-8b1ed846b641-kube-api-access-q5lqv\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: 
\"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.439085 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-nmstate-lock\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.439168 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-nmstate-lock\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: E1206 07:11:15.439313 4954 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 06 07:11:15 crc kubenswrapper[4954]: E1206 07:11:15.439383 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair podName:2d304ffb-d35d-4464-9f6f-8b1ed846b641 nodeName:}" failed. No retries permitted until 2025-12-06 07:11:15.939357599 +0000 UTC m=+850.752716988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-llch7" (UID: "2d304ffb-d35d-4464-9f6f-8b1ed846b641") : secret "openshift-nmstate-webhook" not found Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.439905 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-ovs-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.440294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/60516977-e70c-4cd1-a029-15f0be8f3a12-dbus-socket\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.442970 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.443045 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-q5t79" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.443039 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.456727 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"] Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.464721 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lqv\" (UniqueName: \"kubernetes.io/projected/2d304ffb-d35d-4464-9f6f-8b1ed846b641-kube-api-access-q5lqv\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" Dec 06 
07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.471131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmm5\" (UniqueName: \"kubernetes.io/projected/60516977-e70c-4cd1-a029-15f0be8f3a12-kube-api-access-fnmm5\") pod \"nmstate-handler-k9nmn\" (UID: \"60516977-e70c-4cd1-a029-15f0be8f3a12\") " pod="openshift-nmstate/nmstate-handler-k9nmn"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.486972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg597\" (UniqueName: \"kubernetes.io/projected/257074d3-7882-4f60-b7d6-92374480fafa-kube-api-access-hg597\") pod \"nmstate-metrics-7f946cbc9-cvwlf\" (UID: \"257074d3-7882-4f60-b7d6-92374480fafa\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.540553 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpx42\" (UniqueName: \"kubernetes.io/projected/86a6ada5-37c3-4250-9f12-d1e208786bcf-kube-api-access-bpx42\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.540707 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a6ada5-37c3-4250-9f12-d1e208786bcf-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.540790 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86a6ada5-37c3-4250-9f12-d1e208786bcf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.550369 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k9nmn"
Dec 06 07:11:15 crc kubenswrapper[4954]: W1206 07:11:15.588175 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60516977_e70c_4cd1_a029_15f0be8f3a12.slice/crio-60a866ca3f3416a8330558d7eee00f6cb762924e1da17553a085352adab95420 WatchSource:0}: Error finding container 60a866ca3f3416a8330558d7eee00f6cb762924e1da17553a085352adab95420: Status 404 returned error can't find the container with id 60a866ca3f3416a8330558d7eee00f6cb762924e1da17553a085352adab95420
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.630670 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d69bfccc8-jnq2b"]
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.632033 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.641693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpx42\" (UniqueName: \"kubernetes.io/projected/86a6ada5-37c3-4250-9f12-d1e208786bcf-kube-api-access-bpx42\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.641752 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a6ada5-37c3-4250-9f12-d1e208786bcf-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.641817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86a6ada5-37c3-4250-9f12-d1e208786bcf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.642818 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86a6ada5-37c3-4250-9f12-d1e208786bcf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.647807 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d69bfccc8-jnq2b"]
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.652270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a6ada5-37c3-4250-9f12-d1e208786bcf-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.667060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpx42\" (UniqueName: \"kubernetes.io/projected/86a6ada5-37c3-4250-9f12-d1e208786bcf-kube-api-access-bpx42\") pod \"nmstate-console-plugin-7fbb5f6569-zl4qd\" (UID: \"86a6ada5-37c3-4250-9f12-d1e208786bcf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-oauth-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743217 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-trusted-ca-bundle\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743240 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-service-ca\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743294 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-oauth-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743341 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbl7d\" (UniqueName: \"kubernetes.io/projected/9782a6e6-f010-439c-ba94-6c33a3fde9a1-kube-api-access-nbl7d\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743449 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.743485 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.759223 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.780772 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846269 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-oauth-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846405 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-trusted-ca-bundle\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-oauth-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-service-ca\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846505 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbl7d\" (UniqueName: \"kubernetes.io/projected/9782a6e6-f010-439c-ba94-6c33a3fde9a1-kube-api-access-nbl7d\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846697 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.846766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.848340 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-oauth-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.848479 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.848674 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-trusted-ca-bundle\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.849361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9782a6e6-f010-439c-ba94-6c33a3fde9a1-service-ca\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.851494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-oauth-config\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.853138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9782a6e6-f010-439c-ba94-6c33a3fde9a1-console-serving-cert\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.867641 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbl7d\" (UniqueName: \"kubernetes.io/projected/9782a6e6-f010-439c-ba94-6c33a3fde9a1-kube-api-access-nbl7d\") pod \"console-5d69bfccc8-jnq2b\" (UID: \"9782a6e6-f010-439c-ba94-6c33a3fde9a1\") " pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.948473 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.953253 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:15 crc kubenswrapper[4954]: I1206 07:11:15.953455 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2d304ffb-d35d-4464-9f6f-8b1ed846b641-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-llch7\" (UID: \"2d304ffb-d35d-4464-9f6f-8b1ed846b641\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.016914 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd"]
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.092344 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf"]
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.093770 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"
Dec 06 07:11:16 crc kubenswrapper[4954]: W1206 07:11:16.119750 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod257074d3_7882_4f60_b7d6_92374480fafa.slice/crio-dead4f0ed71ff6c64c5c447f9862bfa5d9ab69a689fe38e0f13da0ad5cdadfc5 WatchSource:0}: Error finding container dead4f0ed71ff6c64c5c447f9862bfa5d9ab69a689fe38e0f13da0ad5cdadfc5: Status 404 returned error can't find the container with id dead4f0ed71ff6c64c5c447f9862bfa5d9ab69a689fe38e0f13da0ad5cdadfc5
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.160799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd" event={"ID":"86a6ada5-37c3-4250-9f12-d1e208786bcf","Type":"ContainerStarted","Data":"153a2cd4e4dead1680e41e68a90b0718114cd7ac45c2acd0a932ac9260ec6fee"}
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.161779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" event={"ID":"257074d3-7882-4f60-b7d6-92374480fafa","Type":"ContainerStarted","Data":"dead4f0ed71ff6c64c5c447f9862bfa5d9ab69a689fe38e0f13da0ad5cdadfc5"}
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.164585 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k9nmn" event={"ID":"60516977-e70c-4cd1-a029-15f0be8f3a12","Type":"ContainerStarted","Data":"60a866ca3f3416a8330558d7eee00f6cb762924e1da17553a085352adab95420"}
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.228416 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d69bfccc8-jnq2b"]
Dec 06 07:11:16 crc kubenswrapper[4954]: W1206 07:11:16.232145 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9782a6e6_f010_439c_ba94_6c33a3fde9a1.slice/crio-c06bc21279cd2fb95bd7420e379b2cef084891dd5af34cac9f6f659f75c55043 WatchSource:0}: Error finding container c06bc21279cd2fb95bd7420e379b2cef084891dd5af34cac9f6f659f75c55043: Status 404 returned error can't find the container with id c06bc21279cd2fb95bd7420e379b2cef084891dd5af34cac9f6f659f75c55043
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.317294 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"]
Dec 06 07:11:16 crc kubenswrapper[4954]: W1206 07:11:16.323679 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d304ffb_d35d_4464_9f6f_8b1ed846b641.slice/crio-a3a2dd588dda7823eaa5e0577b8c2e360ad14adc7ba40b56359a93fbbeb8f8cd WatchSource:0}: Error finding container a3a2dd588dda7823eaa5e0577b8c2e360ad14adc7ba40b56359a93fbbeb8f8cd: Status 404 returned error can't find the container with id a3a2dd588dda7823eaa5e0577b8c2e360ad14adc7ba40b56359a93fbbeb8f8cd
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.550977 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"]
Dec 06 07:11:16 crc kubenswrapper[4954]: I1206 07:11:16.551246 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9pdfn" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="registry-server" containerID="cri-o://25753b34b35aa66935fd29d5b72fc5092159b70778b78dfb0cab54286a9628c2" gracePeriod=2
Dec 06 07:11:17 crc kubenswrapper[4954]: I1206 07:11:17.172595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" event={"ID":"2d304ffb-d35d-4464-9f6f-8b1ed846b641","Type":"ContainerStarted","Data":"a3a2dd588dda7823eaa5e0577b8c2e360ad14adc7ba40b56359a93fbbeb8f8cd"}
Dec 06 07:11:17 crc kubenswrapper[4954]: I1206 07:11:17.174880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d69bfccc8-jnq2b" event={"ID":"9782a6e6-f010-439c-ba94-6c33a3fde9a1","Type":"ContainerStarted","Data":"cd5a9c5e7f1bc5164c98e9f99a93dc97a91b3846c8d193aed6b8886d518b28ea"}
Dec 06 07:11:17 crc kubenswrapper[4954]: I1206 07:11:17.174917 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d69bfccc8-jnq2b" event={"ID":"9782a6e6-f010-439c-ba94-6c33a3fde9a1","Type":"ContainerStarted","Data":"c06bc21279cd2fb95bd7420e379b2cef084891dd5af34cac9f6f659f75c55043"}
Dec 06 07:11:17 crc kubenswrapper[4954]: I1206 07:11:17.204376 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d69bfccc8-jnq2b" podStartSLOduration=2.204355832 podStartE2EDuration="2.204355832s" podCreationTimestamp="2025-12-06 07:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:11:17.198113416 +0000 UTC m=+852.011472805" watchObservedRunningTime="2025-12-06 07:11:17.204355832 +0000 UTC m=+852.017715221"
Dec 06 07:11:19 crc kubenswrapper[4954]: I1206 07:11:19.191800 4954 generic.go:334] "Generic (PLEG): container finished" podID="ed01c7d5-c189-423f-a812-a4963501bd12" containerID="25753b34b35aa66935fd29d5b72fc5092159b70778b78dfb0cab54286a9628c2" exitCode=0
Dec 06 07:11:19 crc kubenswrapper[4954]: I1206 07:11:19.191879 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerDied","Data":"25753b34b35aa66935fd29d5b72fc5092159b70778b78dfb0cab54286a9628c2"}
Dec 06 07:11:19 crc kubenswrapper[4954]: I1206 07:11:19.839851 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pdfn"
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.015380 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content\") pod \"ed01c7d5-c189-423f-a812-a4963501bd12\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") "
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.015470 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4fl2\" (UniqueName: \"kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2\") pod \"ed01c7d5-c189-423f-a812-a4963501bd12\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") "
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.015543 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities\") pod \"ed01c7d5-c189-423f-a812-a4963501bd12\" (UID: \"ed01c7d5-c189-423f-a812-a4963501bd12\") "
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.017218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities" (OuterVolumeSpecName: "utilities") pod "ed01c7d5-c189-423f-a812-a4963501bd12" (UID: "ed01c7d5-c189-423f-a812-a4963501bd12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.023576 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2" (OuterVolumeSpecName: "kube-api-access-m4fl2") pod "ed01c7d5-c189-423f-a812-a4963501bd12" (UID: "ed01c7d5-c189-423f-a812-a4963501bd12"). InnerVolumeSpecName "kube-api-access-m4fl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.117833 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4fl2\" (UniqueName: \"kubernetes.io/projected/ed01c7d5-c189-423f-a812-a4963501bd12-kube-api-access-m4fl2\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.117883 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.131522 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed01c7d5-c189-423f-a812-a4963501bd12" (UID: "ed01c7d5-c189-423f-a812-a4963501bd12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.202602 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pdfn" event={"ID":"ed01c7d5-c189-423f-a812-a4963501bd12","Type":"ContainerDied","Data":"51d0d27e2e5377cae488afcd5e6d4f2594186b70a41bb5b6c5ebb34189299eb2"}
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.202668 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pdfn"
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.202701 4954 scope.go:117] "RemoveContainer" containerID="25753b34b35aa66935fd29d5b72fc5092159b70778b78dfb0cab54286a9628c2"
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.218875 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed01c7d5-c189-423f-a812-a4963501bd12-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.238165 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"]
Dec 06 07:11:20 crc kubenswrapper[4954]: I1206 07:11:20.244409 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9pdfn"]
Dec 06 07:11:21 crc kubenswrapper[4954]: I1206 07:11:21.452914 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" path="/var/lib/kubelet/pods/ed01c7d5-c189-423f-a812-a4963501bd12/volumes"
Dec 06 07:11:21 crc kubenswrapper[4954]: I1206 07:11:21.924735 4954 scope.go:117] "RemoveContainer" containerID="9f95baf92899398063747a2915f83ac1f2430b49b4cce0163c5b964963211f6d"
Dec 06 07:11:21 crc kubenswrapper[4954]: I1206 07:11:21.958636 4954 scope.go:117] "RemoveContainer" containerID="9bd63291a0e444cc4d47156a929b528dc03fbe1b4953147b37b28079f8dde006"
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.226334 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd" event={"ID":"86a6ada5-37c3-4250-9f12-d1e208786bcf","Type":"ContainerStarted","Data":"7d35376cd7bef8b641b32e4bb44edffa9bc428fbb96b3b88fe296bb7f235e8a1"}
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.228128 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" event={"ID":"257074d3-7882-4f60-b7d6-92374480fafa","Type":"ContainerStarted","Data":"9e64e8ba4e70dfbdf3039432473aa5a529f3ed144c848a07e60ef123b6ede905"}
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.231038 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" event={"ID":"2d304ffb-d35d-4464-9f6f-8b1ed846b641","Type":"ContainerStarted","Data":"064ed57f27645de8f69933bf150006fa24b160316028c13435f0ad5cde608c7a"}
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.231272 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.233428 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k9nmn" event={"ID":"60516977-e70c-4cd1-a029-15f0be8f3a12","Type":"ContainerStarted","Data":"3267b2b5ef0a4e80008ab3fa5e6be6c443954110551b62f9aac86dbc4acf6f9c"}
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.233885 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k9nmn"
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.245348 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zl4qd" podStartSLOduration=2.279949731 podStartE2EDuration="8.245330298s" podCreationTimestamp="2025-12-06 07:11:15 +0000 UTC" firstStartedPulling="2025-12-06 07:11:16.030385629 +0000 UTC m=+850.843745018" lastFinishedPulling="2025-12-06 07:11:21.995766196 +0000 UTC m=+856.809125585" observedRunningTime="2025-12-06 07:11:23.244990229 +0000 UTC m=+858.058349628" watchObservedRunningTime="2025-12-06 07:11:23.245330298 +0000 UTC m=+858.058689687"
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.266527 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k9nmn" podStartSLOduration=1.863588354 podStartE2EDuration="8.266510653s" podCreationTimestamp="2025-12-06 07:11:15 +0000 UTC" firstStartedPulling="2025-12-06 07:11:15.592924859 +0000 UTC m=+850.406284258" lastFinishedPulling="2025-12-06 07:11:21.995847168 +0000 UTC m=+856.809206557" observedRunningTime="2025-12-06 07:11:23.263129783 +0000 UTC m=+858.076489172" watchObservedRunningTime="2025-12-06 07:11:23.266510653 +0000 UTC m=+858.079870042"
Dec 06 07:11:23 crc kubenswrapper[4954]: I1206 07:11:23.289950 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7" podStartSLOduration=2.619688491 podStartE2EDuration="8.289931919s" podCreationTimestamp="2025-12-06 07:11:15 +0000 UTC" firstStartedPulling="2025-12-06 07:11:16.326901585 +0000 UTC m=+851.140260974" lastFinishedPulling="2025-12-06 07:11:21.997145013 +0000 UTC m=+856.810504402" observedRunningTime="2025-12-06 07:11:23.286338723 +0000 UTC m=+858.099698122" watchObservedRunningTime="2025-12-06 07:11:23.289931919 +0000 UTC m=+858.103291308"
Dec 06 07:11:25 crc kubenswrapper[4954]: I1206 07:11:25.954386 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:25 crc kubenswrapper[4954]: I1206 07:11:25.955229 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:25 crc kubenswrapper[4954]: I1206 07:11:25.960818 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:26 crc kubenswrapper[4954]: I1206 07:11:26.257631 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d69bfccc8-jnq2b"
Dec 06 07:11:26 crc kubenswrapper[4954]: I1206 07:11:26.312275 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"]
Dec 06 07:11:30 crc kubenswrapper[4954]: I1206 07:11:30.275380 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" event={"ID":"257074d3-7882-4f60-b7d6-92374480fafa","Type":"ContainerStarted","Data":"b317d0f2a115dbc9cef45d64092bd1af82062b8fc9ac626e684541720d04c7bf"}
Dec 06 07:11:30 crc kubenswrapper[4954]: I1206 07:11:30.292706 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-cvwlf" podStartSLOduration=2.146879738 podStartE2EDuration="15.292684973s" podCreationTimestamp="2025-12-06 07:11:15 +0000 UTC" firstStartedPulling="2025-12-06 07:11:16.124275765 +0000 UTC m=+850.937635144" lastFinishedPulling="2025-12-06 07:11:29.27008099 +0000 UTC m=+864.083440379" observedRunningTime="2025-12-06 07:11:30.290942206 +0000 UTC m=+865.104301605" watchObservedRunningTime="2025-12-06 07:11:30.292684973 +0000 UTC m=+865.106044362"
Dec 06 07:11:30 crc kubenswrapper[4954]: I1206 07:11:30.577243 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k9nmn"
Dec 06 07:11:36 crc kubenswrapper[4954]: I1206 07:11:36.107607 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-llch7"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.417907 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"]
Dec 06 07:11:49 crc kubenswrapper[4954]: E1206 07:11:49.418885 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="extract-utilities"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.418906 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="extract-utilities"
Dec 06 07:11:49 crc kubenswrapper[4954]: E1206 07:11:49.418937 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="extract-content"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.418946 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="extract-content"
Dec 06 07:11:49 crc kubenswrapper[4954]: E1206 07:11:49.418963 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="registry-server"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.418972 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="registry-server"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.419098 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed01c7d5-c189-423f-a812-a4963501bd12" containerName="registry-server"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.420197 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.423168 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.426126 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"]
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.545030 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.545109 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltfj\" (UniqueName: \"kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.545139 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.646258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.646322 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltfj\" (UniqueName: \"kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.646365 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.646928 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.647028 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.669392 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltfj\" (UniqueName: \"kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.743922 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:49 crc kubenswrapper[4954]: I1206 07:11:49.935845 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"]
Dec 06 07:11:50 crc kubenswrapper[4954]: I1206 07:11:50.401942 4954 generic.go:334] "Generic (PLEG): container finished" podID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerID="22483f1da94486bff83cc9f59aa2e1faa51d80d4500a00ea8bcf111d9fe50891" exitCode=0
Dec 06 07:11:50 crc kubenswrapper[4954]: I1206 07:11:50.402022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd" event={"ID":"688b9e13-c31d-446c-acb7-ba10a4a6a0f1","Type":"ContainerDied","Data":"22483f1da94486bff83cc9f59aa2e1faa51d80d4500a00ea8bcf111d9fe50891"}
Dec 06 07:11:50 crc kubenswrapper[4954]: I1206 07:11:50.402064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd" event={"ID":"688b9e13-c31d-446c-acb7-ba10a4a6a0f1","Type":"ContainerStarted","Data":"2893c195370b673235ba23be1270a241ecf79e90e4f3c7185489b008c0676ecd"}
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.353695 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-z4t2h" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console" containerID="cri-o://dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3" gracePeriod=15
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.804465 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-z4t2h_b7733adb-d709-4fee-bfc6-728721369b82/console/0.log"
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.804596 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.979994 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980104 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980153 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980185 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980226 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980319 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.980364 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vvk2\" (UniqueName: \"kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2\") pod \"b7733adb-d709-4fee-bfc6-728721369b82\" (UID: \"b7733adb-d709-4fee-bfc6-728721369b82\") "
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.981456 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config" (OuterVolumeSpecName: "console-config") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.981492 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.981706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.981945 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca" (OuterVolumeSpecName: "service-ca") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.986171 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.986866 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2" (OuterVolumeSpecName: "kube-api-access-6vvk2") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "kube-api-access-6vvk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:11:51 crc kubenswrapper[4954]: I1206 07:11:51.986918 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b7733adb-d709-4fee-bfc6-728721369b82" (UID: "b7733adb-d709-4fee-bfc6-728721369b82"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082616 4954 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-service-ca\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082671 4954 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082685 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vvk2\" (UniqueName: \"kubernetes.io/projected/b7733adb-d709-4fee-bfc6-728721369b82-kube-api-access-6vvk2\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082697 4954 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082709 4954 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-console-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082722 4954 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7733adb-d709-4fee-bfc6-728721369b82-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.082734 4954 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7733adb-d709-4fee-bfc6-728721369b82-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.419487 4954 generic.go:334] "Generic (PLEG): container finished" podID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerID="3ae88efaec2c9f62791ac089d380bfb8f3e1a5977bbefc44663863de235733b9" exitCode=0
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.419639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd" event={"ID":"688b9e13-c31d-446c-acb7-ba10a4a6a0f1","Type":"ContainerDied","Data":"3ae88efaec2c9f62791ac089d380bfb8f3e1a5977bbefc44663863de235733b9"}
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422658 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-z4t2h_b7733adb-d709-4fee-bfc6-728721369b82/console/0.log"
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422708 4954 generic.go:334] "Generic (PLEG): container finished" podID="b7733adb-d709-4fee-bfc6-728721369b82" containerID="dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3" exitCode=2
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422742 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z4t2h" event={"ID":"b7733adb-d709-4fee-bfc6-728721369b82","Type":"ContainerDied","Data":"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"}
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z4t2h" event={"ID":"b7733adb-d709-4fee-bfc6-728721369b82","Type":"ContainerDied","Data":"02b8a177e8cbe0122f0c1c162d47c1e78341dd182ce0d247fb5e5acb3b2e9f7e"}
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422806 4954 scope.go:117] "RemoveContainer" containerID="dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.422815 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z4t2h"
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.459580 4954 scope.go:117] "RemoveContainer" containerID="dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"
Dec 06 07:11:52 crc kubenswrapper[4954]: E1206 07:11:52.460231 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3\": container with ID starting with dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3 not found: ID does not exist" containerID="dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.460275 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3"} err="failed to get container status \"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3\": rpc error: code = NotFound desc = could not find container \"dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3\": container with ID starting with dad48dc047c52cd05870b82198941900349d8c4ed02bda7e065e4961c5eaabb3 not found: ID does not exist"
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.468505 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"]
Dec 06 07:11:52 crc kubenswrapper[4954]: I1206 07:11:52.473879 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-z4t2h"]
Dec 06 07:11:53 crc kubenswrapper[4954]: I1206 07:11:53.431789 4954 generic.go:334] "Generic (PLEG): container finished" podID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerID="339aeaac4eeeba21c820ceae9e3a180fb893a88547809b03b13f4ccbe951991e" exitCode=0
Dec 06 07:11:53 crc kubenswrapper[4954]: I1206 07:11:53.431886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd" event={"ID":"688b9e13-c31d-446c-acb7-ba10a4a6a0f1","Type":"ContainerDied","Data":"339aeaac4eeeba21c820ceae9e3a180fb893a88547809b03b13f4ccbe951991e"}
Dec 06 07:11:53 crc kubenswrapper[4954]: I1206 07:11:53.451281 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7733adb-d709-4fee-bfc6-728721369b82" path="/var/lib/kubelet/pods/b7733adb-d709-4fee-bfc6-728721369b82/volumes"
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.680337 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.819705 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util\") pod \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") "
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.819800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle\") pod \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") "
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.819899 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ltfj\" (UniqueName: \"kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj\") pod \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\" (UID: \"688b9e13-c31d-446c-acb7-ba10a4a6a0f1\") "
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.821148 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle" (OuterVolumeSpecName: "bundle") pod "688b9e13-c31d-446c-acb7-ba10a4a6a0f1" (UID: "688b9e13-c31d-446c-acb7-ba10a4a6a0f1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.828154 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj" (OuterVolumeSpecName: "kube-api-access-8ltfj") pod "688b9e13-c31d-446c-acb7-ba10a4a6a0f1" (UID: "688b9e13-c31d-446c-acb7-ba10a4a6a0f1"). InnerVolumeSpecName "kube-api-access-8ltfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.835021 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util" (OuterVolumeSpecName: "util") pod "688b9e13-c31d-446c-acb7-ba10a4a6a0f1" (UID: "688b9e13-c31d-446c-acb7-ba10a4a6a0f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.921300 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-util\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.921618 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:54 crc kubenswrapper[4954]: I1206 07:11:54.921752 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ltfj\" (UniqueName: \"kubernetes.io/projected/688b9e13-c31d-446c-acb7-ba10a4a6a0f1-kube-api-access-8ltfj\") on node \"crc\" DevicePath \"\""
Dec 06 07:11:55 crc kubenswrapper[4954]: I1206 07:11:55.451081 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd"
Dec 06 07:11:55 crc kubenswrapper[4954]: I1206 07:11:55.451544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd" event={"ID":"688b9e13-c31d-446c-acb7-ba10a4a6a0f1","Type":"ContainerDied","Data":"2893c195370b673235ba23be1270a241ecf79e90e4f3c7185489b008c0676ecd"}
Dec 06 07:11:55 crc kubenswrapper[4954]: I1206 07:11:55.451621 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2893c195370b673235ba23be1270a241ecf79e90e4f3c7185489b008c0676ecd"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.214966 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"]
Dec 06 07:12:04 crc kubenswrapper[4954]: E1206 07:12:04.215939 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.215955 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console"
Dec 06 07:12:04 crc kubenswrapper[4954]: E1206 07:12:04.215972 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="extract"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.215978 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="extract"
Dec 06 07:12:04 crc kubenswrapper[4954]: E1206 07:12:04.215996 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="util"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.216004 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="util"
Dec 06 07:12:04 crc kubenswrapper[4954]: E1206 07:12:04.216013 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="pull"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.216019 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="pull"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.216124 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="688b9e13-c31d-446c-acb7-ba10a4a6a0f1" containerName="extract"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.216142 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7733adb-d709-4fee-bfc6-728721369b82" containerName="console"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.216620 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.222542 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.222816 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.223054 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.226099 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.229657 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5b6nr"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.247685 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"]
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.253781 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-webhook-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.253844 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8s6w\" (UniqueName: \"kubernetes.io/projected/63d7776b-9c1b-4fe6-a76e-74225b88dab7-kube-api-access-n8s6w\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.254088 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-apiservice-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.355249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-apiservice-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.355349 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-webhook-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.355379 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8s6w\" (UniqueName: \"kubernetes.io/projected/63d7776b-9c1b-4fe6-a76e-74225b88dab7-kube-api-access-n8s6w\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.363013 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-webhook-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.367079 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d7776b-9c1b-4fe6-a76e-74225b88dab7-apiservice-cert\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.375158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8s6w\" (UniqueName: \"kubernetes.io/projected/63d7776b-9c1b-4fe6-a76e-74225b88dab7-kube-api-access-n8s6w\") pod \"metallb-operator-controller-manager-574b6cdf6f-tkt77\" (UID: \"63d7776b-9c1b-4fe6-a76e-74225b88dab7\") " pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.486690 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"]
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.488047 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.490625 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.490847 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.492716 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cqbfr"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.540046 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.561550 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"]
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.663169 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-webhook-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.663697 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sq8h\" (UniqueName: \"kubernetes.io/projected/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-kube-api-access-2sq8h\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.663845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-apiservice-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.766955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sq8h\" (UniqueName: \"kubernetes.io/projected/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-kube-api-access-2sq8h\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.767035 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-apiservice-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.767114 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-webhook-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.773095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-webhook-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.784099 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-apiservice-cert\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.798606 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sq8h\" (UniqueName: \"kubernetes.io/projected/932f3a22-69f1-427c-8b8c-7d1cb2f32f8a-kube-api-access-2sq8h\") pod \"metallb-operator-webhook-server-767c58648-lwjz8\" (UID: \"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a\") " pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:04 crc kubenswrapper[4954]: I1206 07:12:04.809216 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"
Dec 06 07:12:05 crc kubenswrapper[4954]: I1206 07:12:05.007221 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"]
Dec 06 07:12:05 crc kubenswrapper[4954]: I1206 07:12:05.166717 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-767c58648-lwjz8"]
Dec 06 07:12:05 crc kubenswrapper[4954]: W1206 07:12:05.169510 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod932f3a22_69f1_427c_8b8c_7d1cb2f32f8a.slice/crio-07d1c2c621806ff9d42a16e177181fd777bdd5f3ddfad5557ef0760cc1de056a WatchSource:0}: Error finding container 07d1c2c621806ff9d42a16e177181fd777bdd5f3ddfad5557ef0760cc1de056a: Status 404 returned error can't find the container with id 07d1c2c621806ff9d42a16e177181fd777bdd5f3ddfad5557ef0760cc1de056a
Dec 06 07:12:05 crc kubenswrapper[4954]: I1206 07:12:05.516924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8" event={"ID":"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a","Type":"ContainerStarted","Data":"07d1c2c621806ff9d42a16e177181fd777bdd5f3ddfad5557ef0760cc1de056a"}
Dec 06 07:12:05 crc kubenswrapper[4954]: I1206 07:12:05.522350 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77" event={"ID":"63d7776b-9c1b-4fe6-a76e-74225b88dab7","Type":"ContainerStarted","Data":"d410cbbecb7f1b93b0e22188a3d98b4c48cf49daad30fe0d87c02f6aa4676e01"}
Dec 06 07:12:09 crc kubenswrapper[4954]: I1206 07:12:09.574209 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77" event={"ID":"63d7776b-9c1b-4fe6-a76e-74225b88dab7","Type":"ContainerStarted","Data":"6d4549e59c4885128a21bb442f1fb616c70c9cb2e70c313db179664587fd8784"}
Dec 06 07:12:09 crc kubenswrapper[4954]: I1206 07:12:09.575027 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77"
Dec 06 07:12:09 crc kubenswrapper[4954]: I1206 07:12:09.612109 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77" podStartSLOduration=1.694627039 podStartE2EDuration="5.61209204s" podCreationTimestamp="2025-12-06 07:12:04 +0000 UTC" firstStartedPulling="2025-12-06 07:12:05.010972946 +0000 UTC m=+899.824332335" lastFinishedPulling="2025-12-06 07:12:08.928437947 +0000 UTC m=+903.741797336"
observedRunningTime="2025-12-06 07:12:09.610548029 +0000 UTC m=+904.423907418" watchObservedRunningTime="2025-12-06 07:12:09.61209204 +0000 UTC m=+904.425451429" Dec 06 07:12:10 crc kubenswrapper[4954]: I1206 07:12:10.101698 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:12:10 crc kubenswrapper[4954]: I1206 07:12:10.101799 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:12:14 crc kubenswrapper[4954]: I1206 07:12:14.609536 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8" event={"ID":"932f3a22-69f1-427c-8b8c-7d1cb2f32f8a","Type":"ContainerStarted","Data":"5253508af3ec1ecf55fc134b62b24fa7d9f6003896a9028e1602348832e12f9f"} Dec 06 07:12:14 crc kubenswrapper[4954]: I1206 07:12:14.610223 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8" Dec 06 07:12:14 crc kubenswrapper[4954]: I1206 07:12:14.631775 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8" podStartSLOduration=2.144236032 podStartE2EDuration="10.631735218s" podCreationTimestamp="2025-12-06 07:12:04 +0000 UTC" firstStartedPulling="2025-12-06 07:12:05.172229361 +0000 UTC m=+899.985588750" lastFinishedPulling="2025-12-06 07:12:13.659728547 +0000 UTC m=+908.473087936" observedRunningTime="2025-12-06 07:12:14.627661959 +0000 UTC m=+909.441021348" watchObservedRunningTime="2025-12-06 07:12:14.631735218 +0000 UTC m=+909.445094607" Dec 06 07:12:24 crc kubenswrapper[4954]: I1206 07:12:24.819770 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-767c58648-lwjz8" Dec 06 07:12:40 crc kubenswrapper[4954]: I1206 07:12:40.101411 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:12:40 crc kubenswrapper[4954]: I1206 07:12:40.102491 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.320947 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.322459 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.336030 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.429960 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcck\" (UniqueName: \"kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.430404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.430749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.532128 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.532204 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcck\" (UniqueName: \"kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.532234 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.532745 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.533203 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.557109 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hbcck\" (UniqueName: \"kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck\") pod \"certified-operators-xsq46\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.646308 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:43 crc kubenswrapper[4954]: I1206 07:12:43.995176 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:44 crc kubenswrapper[4954]: I1206 07:12:44.544423 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-574b6cdf6f-tkt77" Dec 06 07:12:44 crc kubenswrapper[4954]: I1206 07:12:44.809500 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerStarted","Data":"786c835973cff502058bdb5eeac02e62b4c2d4d11a988162bfb288ebda10593c"} Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.335198 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p95p2"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.338498 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.341176 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.347581 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.348830 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.352977 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.353355 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cwk66" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.353478 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.353604 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.431054 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-fjt49"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.432330 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.434379 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.434914 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8tq8n" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.435083 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.435263 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.444335 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-ql4ww"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.445357 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.448102 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.462881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bda5decc-cd20-487a-b0b5-01058cac828c-metrics-certs\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.462955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-reloader\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.462991 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4509df09-42d5-4323-8d31-88fef1e763c2-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: \"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndhh\" (UniqueName: \"kubernetes.io/projected/4509df09-42d5-4323-8d31-88fef1e763c2-kube-api-access-tndhh\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: \"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-conf\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463146 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxc2s\" (UniqueName: \"kubernetes.io/projected/bda5decc-cd20-487a-b0b5-01058cac828c-kube-api-access-hxc2s\") pod \"frr-k8s-p95p2\" 
(UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463195 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-sockets\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bda5decc-cd20-487a-b0b5-01058cac828c-frr-startup\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.463297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-metrics\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.464105 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ql4ww"] Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.564970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndhh\" (UniqueName: \"kubernetes.io/projected/4509df09-42d5-4323-8d31-88fef1e763c2-kube-api-access-tndhh\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: \"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565056 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbc4\" (UniqueName: \"kubernetes.io/projected/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-kube-api-access-hnbc4\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565087 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-cert\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565115 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-conf\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565161 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52w58\" (UniqueName: \"kubernetes.io/projected/ab7695f2-803f-469d-9302-c1bec12de24e-kube-api-access-52w58\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565193 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxc2s\" (UniqueName: 
\"kubernetes.io/projected/bda5decc-cd20-487a-b0b5-01058cac828c-kube-api-access-hxc2s\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565223 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-sockets\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565292 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metrics-certs\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565318 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bda5decc-cd20-487a-b0b5-01058cac828c-frr-startup\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565381 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565418 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-metrics\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bda5decc-cd20-487a-b0b5-01058cac828c-metrics-certs\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565474 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-reloader\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565497 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4509df09-42d5-4323-8d31-88fef1e763c2-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: \"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.565525 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-metrics-certs\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc 
kubenswrapper[4954]: I1206 07:12:45.565737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-conf\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.566158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-metrics\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.566156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-reloader\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.566248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bda5decc-cd20-487a-b0b5-01058cac828c-frr-sockets\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.569002 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metallb-excludel2\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.569187 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bda5decc-cd20-487a-b0b5-01058cac828c-frr-startup\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.586730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bda5decc-cd20-487a-b0b5-01058cac828c-metrics-certs\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.590731 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4509df09-42d5-4323-8d31-88fef1e763c2-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: \"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.591676 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxc2s\" (UniqueName: \"kubernetes.io/projected/bda5decc-cd20-487a-b0b5-01058cac828c-kube-api-access-hxc2s\") pod \"frr-k8s-p95p2\" (UID: \"bda5decc-cd20-487a-b0b5-01058cac828c\") " pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.593466 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndhh\" (UniqueName: \"kubernetes.io/projected/4509df09-42d5-4323-8d31-88fef1e763c2-kube-api-access-tndhh\") pod \"frr-k8s-webhook-server-7fcb986d4-8f4wp\" (UID: 
\"4509df09-42d5-4323-8d31-88fef1e763c2\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.659436 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.669470 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-metrics-certs\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671236 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metallb-excludel2\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671267 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbc4\" (UniqueName: \"kubernetes.io/projected/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-kube-api-access-hnbc4\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671287 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-cert\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671312 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52w58\" (UniqueName: \"kubernetes.io/projected/ab7695f2-803f-469d-9302-c1bec12de24e-kube-api-access-52w58\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metrics-certs\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.671369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: E1206 07:12:45.671484 4954 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 07:12:45 crc kubenswrapper[4954]: E1206 07:12:45.671550 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist podName:3048d4de-ffa6-4164-afde-4c6fbfe6abb3 nodeName:}" failed. 
No retries permitted until 2025-12-06 07:12:46.17152719 +0000 UTC m=+940.984886579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist") pod "speaker-fjt49" (UID: "3048d4de-ffa6-4164-afde-4c6fbfe6abb3") : secret "metallb-memberlist" not found Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.672886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metallb-excludel2\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.675324 4954 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.676434 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-metrics-certs\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.676462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-metrics-certs\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.688437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab7695f2-803f-469d-9302-c1bec12de24e-cert\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.696424 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52w58\" (UniqueName: \"kubernetes.io/projected/ab7695f2-803f-469d-9302-c1bec12de24e-kube-api-access-52w58\") pod \"controller-f8648f98b-ql4ww\" (UID: \"ab7695f2-803f-469d-9302-c1bec12de24e\") " pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.706943 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbc4\" (UniqueName: \"kubernetes.io/projected/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-kube-api-access-hnbc4\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:45 crc kubenswrapper[4954]: I1206 07:12:45.761019 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.009697 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-ql4ww"] Dec 06 07:12:46 crc kubenswrapper[4954]: W1206 07:12:46.015335 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7695f2_803f_469d_9302_c1bec12de24e.slice/crio-a4f2014172f2c48b172e75c8336af25de21a154ddcab6c6b4ecaf7d107ab821f WatchSource:0}: Error finding container a4f2014172f2c48b172e75c8336af25de21a154ddcab6c6b4ecaf7d107ab821f: Status 404 returned error can't find the container with id a4f2014172f2c48b172e75c8336af25de21a154ddcab6c6b4ecaf7d107ab821f Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.116990 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp"] Dec 06 07:12:46 crc kubenswrapper[4954]: W1206 07:12:46.122671 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4509df09_42d5_4323_8d31_88fef1e763c2.slice/crio-79d8e832ac8de2e7ccb4d6d70a3d64f32472296c37f33426337268abf90dad1e WatchSource:0}: Error finding container 79d8e832ac8de2e7ccb4d6d70a3d64f32472296c37f33426337268abf90dad1e: Status 404 returned error can't find the container with id 79d8e832ac8de2e7ccb4d6d70a3d64f32472296c37f33426337268abf90dad1e Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.185735 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:46 crc kubenswrapper[4954]: E1206 07:12:46.185933 4954 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 07:12:46 crc kubenswrapper[4954]: E1206 07:12:46.186022 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist podName:3048d4de-ffa6-4164-afde-4c6fbfe6abb3 nodeName:}" failed. No retries permitted until 2025-12-06 07:12:47.185994505 +0000 UTC m=+941.999353904 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist") pod "speaker-fjt49" (UID: "3048d4de-ffa6-4164-afde-4c6fbfe6abb3") : secret "metallb-memberlist" not found Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.826281 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" event={"ID":"4509df09-42d5-4323-8d31-88fef1e763c2","Type":"ContainerStarted","Data":"79d8e832ac8de2e7ccb4d6d70a3d64f32472296c37f33426337268abf90dad1e"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.828728 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ql4ww" event={"ID":"ab7695f2-803f-469d-9302-c1bec12de24e","Type":"ContainerStarted","Data":"b159527084b14f8cd1d32ae9c1cafd211df99a6d10f3c246c543f7e514497c73"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.828772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ql4ww" event={"ID":"ab7695f2-803f-469d-9302-c1bec12de24e","Type":"ContainerStarted","Data":"112612d7cfe74f5124bf6acd56aa3cc7398879aa4bc55a525d1fe83dfce39987"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.828788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-ql4ww" event={"ID":"ab7695f2-803f-469d-9302-c1bec12de24e","Type":"ContainerStarted","Data":"a4f2014172f2c48b172e75c8336af25de21a154ddcab6c6b4ecaf7d107ab821f"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.828863 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.830610 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerID="821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090" exitCode=0 Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.830683 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerDied","Data":"821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.833659 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"f9e64047de7cf02c528d2a7991f70f37eff1eb921f906f14e0dc29c6154c0e94"} Dec 06 07:12:46 crc kubenswrapper[4954]: I1206 07:12:46.855188 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-ql4ww" podStartSLOduration=1.855156051 podStartE2EDuration="1.855156051s" podCreationTimestamp="2025-12-06 07:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:12:46.85061283 +0000 UTC m=+941.663972219" watchObservedRunningTime="2025-12-06 07:12:46.855156051 +0000 UTC m=+941.668515440" Dec 06 07:12:47 crc kubenswrapper[4954]: I1206 07:12:47.201802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:47 crc 
kubenswrapper[4954]: I1206 07:12:47.214839 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3048d4de-ffa6-4164-afde-4c6fbfe6abb3-memberlist\") pod \"speaker-fjt49\" (UID: \"3048d4de-ffa6-4164-afde-4c6fbfe6abb3\") " pod="metallb-system/speaker-fjt49" Dec 06 07:12:47 crc kubenswrapper[4954]: I1206 07:12:47.253478 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fjt49" Dec 06 07:12:47 crc kubenswrapper[4954]: I1206 07:12:47.863366 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fjt49" event={"ID":"3048d4de-ffa6-4164-afde-4c6fbfe6abb3","Type":"ContainerStarted","Data":"343edbfea0870a6c39820cc4f135a708cd16ac625594ff607bf93d4c61609c7b"} Dec 06 07:12:47 crc kubenswrapper[4954]: I1206 07:12:47.863923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fjt49" event={"ID":"3048d4de-ffa6-4164-afde-4c6fbfe6abb3","Type":"ContainerStarted","Data":"a5eff78383d1a08b02fa4dab91ca171845c3a8f9a464ab0e90bc849e4227d25d"} Dec 06 07:12:47 crc kubenswrapper[4954]: I1206 07:12:47.868977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerStarted","Data":"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a"} Dec 06 07:12:48 crc kubenswrapper[4954]: I1206 07:12:48.886607 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fjt49" event={"ID":"3048d4de-ffa6-4164-afde-4c6fbfe6abb3","Type":"ContainerStarted","Data":"bf8cb8e4b62b5d988102915c307c7b99bfd575d0d97d7b596cc50f99105c1019"} Dec 06 07:12:48 crc kubenswrapper[4954]: I1206 07:12:48.886739 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-fjt49" Dec 06 07:12:48 crc kubenswrapper[4954]: I1206 07:12:48.890518 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerID="6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a" exitCode=0 Dec 06 07:12:48 crc kubenswrapper[4954]: I1206 07:12:48.890595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerDied","Data":"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a"} Dec 06 07:12:48 crc kubenswrapper[4954]: I1206 07:12:48.949000 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-fjt49" podStartSLOduration=3.948973033 podStartE2EDuration="3.948973033s" podCreationTimestamp="2025-12-06 07:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:12:48.9098984 +0000 UTC m=+943.723257779" watchObservedRunningTime="2025-12-06 07:12:48.948973033 +0000 UTC m=+943.762332422" Dec 06 07:12:49 crc kubenswrapper[4954]: I1206 07:12:49.913813 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerStarted","Data":"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816"} Dec 06 07:12:49 crc kubenswrapper[4954]: I1206 07:12:49.942127 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xsq46" 
podStartSLOduration=4.346580942 podStartE2EDuration="6.942105099s" podCreationTimestamp="2025-12-06 07:12:43 +0000 UTC" firstStartedPulling="2025-12-06 07:12:46.833702799 +0000 UTC m=+941.647062188" lastFinishedPulling="2025-12-06 07:12:49.429226956 +0000 UTC m=+944.242586345" observedRunningTime="2025-12-06 07:12:49.938690898 +0000 UTC m=+944.752050287" watchObservedRunningTime="2025-12-06 07:12:49.942105099 +0000 UTC m=+944.755464488" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.695518 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.697119 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.707617 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.861820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.861890 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.861956 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vgxt\" (UniqueName: \"kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.963382 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.963474 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.963545 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vgxt\" (UniqueName: \"kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.964175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:51 crc kubenswrapper[4954]: I1206 07:12:51.964265 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:52 crc kubenswrapper[4954]: I1206 07:12:52.092109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vgxt\" (UniqueName: \"kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt\") pod \"community-operators-4rdgp\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:52 crc kubenswrapper[4954]: I1206 07:12:52.315682 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:12:53 crc kubenswrapper[4954]: I1206 07:12:53.650744 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:53 crc kubenswrapper[4954]: I1206 07:12:53.652038 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:53 crc kubenswrapper[4954]: I1206 07:12:53.712332 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:54 crc kubenswrapper[4954]: I1206 07:12:54.995860 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:55 crc kubenswrapper[4954]: I1206 07:12:55.893310 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.276660 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.971991 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" event={"ID":"4509df09-42d5-4323-8d31-88fef1e763c2","Type":"ContainerStarted","Data":"428e2fa6036b964ee197d2d80b776a6438b6f13974814e1d56f6c7ace5f19059"} Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.972449 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.974624 4954 generic.go:334] "Generic (PLEG): container finished" podID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerID="29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819" exitCode=0 Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.974791 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerDied","Data":"29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819"} Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.974883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerStarted","Data":"1950352de1859f6615d9a2ca5405eddabe0401bda345840bc1b0b0d20a4ece20"} Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.977230 4954 generic.go:334] "Generic (PLEG): container finished" podID="bda5decc-cd20-487a-b0b5-01058cac828c" containerID="5c4f5aed65749064e8517074f460e4c7724d7636bb6bf0b75a2d79317bc32d12" exitCode=0 Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.977432 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerDied","Data":"5c4f5aed65749064e8517074f460e4c7724d7636bb6bf0b75a2d79317bc32d12"} Dec 06 07:12:56 crc kubenswrapper[4954]: I1206 07:12:56.977511 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xsq46" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="registry-server" containerID="cri-o://8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816" gracePeriod=2 Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.002926 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" podStartSLOduration=2.203663357 podStartE2EDuration="12.002893373s" podCreationTimestamp="2025-12-06 07:12:45 +0000 UTC" firstStartedPulling="2025-12-06 07:12:46.126798315 +0000 UTC m=+940.940157704" lastFinishedPulling="2025-12-06 07:12:55.926028331 +0000 UTC m=+950.739387720" observedRunningTime="2025-12-06 07:12:56.990215664 +0000 UTC m=+951.803575063" watchObservedRunningTime="2025-12-06 07:12:57.002893373 +0000 UTC m=+951.816252772" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.258493 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-fjt49" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.390811 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.408055 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content\") pod \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.408121 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbcck\" (UniqueName: \"kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck\") pod \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.408276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities\") pod \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\" (UID: \"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91\") " Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.409593 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities" (OuterVolumeSpecName: "utilities") pod "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" (UID: "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.421859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck" (OuterVolumeSpecName: "kube-api-access-hbcck") pod "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" (UID: "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91"). InnerVolumeSpecName "kube-api-access-hbcck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.462992 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" (UID: "7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.510278 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.510884 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.510911 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbcck\" (UniqueName: \"kubernetes.io/projected/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91-kube-api-access-hbcck\") on node \"crc\" DevicePath \"\"" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.988047 4954 generic.go:334] "Generic (PLEG): container finished" podID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerID="28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332" exitCode=0 Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.988145 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerDied","Data":"28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332"} Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.990965 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerID="8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816" exitCode=0 Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.991079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerDied","Data":"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816"} Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.991109 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xsq46" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.991162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xsq46" event={"ID":"7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91","Type":"ContainerDied","Data":"786c835973cff502058bdb5eeac02e62b4c2d4d11a988162bfb288ebda10593c"} Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.991201 4954 scope.go:117] "RemoveContainer" containerID="8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816" Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.997395 4954 generic.go:334] "Generic (PLEG): container finished" podID="bda5decc-cd20-487a-b0b5-01058cac828c" containerID="883b4e3262083f867767c4db5d3c910330cd5d0c07afb883df3ca101d4bc3ba6" exitCode=0 Dec 06 07:12:57 crc kubenswrapper[4954]: I1206 07:12:57.998843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerDied","Data":"883b4e3262083f867767c4db5d3c910330cd5d0c07afb883df3ca101d4bc3ba6"} Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.018514 4954 scope.go:117] "RemoveContainer" containerID="6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.042196 4954 scope.go:117] "RemoveContainer" containerID="821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.067783 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.073556 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xsq46"] Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.093774 4954 scope.go:117] "RemoveContainer" containerID="8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816" Dec 06 07:12:58 crc kubenswrapper[4954]: E1206 07:12:58.095203 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816\": container with ID starting with 8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816 not found: ID does not exist" containerID="8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.095287 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816"} err="failed to get container status \"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816\": rpc error: code = NotFound desc = could not find container \"8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816\": container with ID starting with 8caca299b4cc645e13248f495c3419d9b1a64185c944108109d93a042be8d816 not found: ID does not exist" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.095335 4954 scope.go:117] "RemoveContainer" containerID="6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a" Dec 06 07:12:58 crc kubenswrapper[4954]: E1206 07:12:58.095740 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a\": container with ID starting with 
6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a not found: ID does not exist" containerID="6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.095777 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a"} err="failed to get container status \"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a\": rpc error: code = NotFound desc = could not find container \"6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a\": container with ID starting with 6fb3a0fbb2cc9fcfc4c10003a69110f49ec7a5302167d5bd6d445548a621260a not found: ID does not exist" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.095801 4954 scope.go:117] "RemoveContainer" containerID="821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090" Dec 06 07:12:58 crc kubenswrapper[4954]: E1206 07:12:58.096174 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090\": container with ID starting with 821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090 not found: ID does not exist" containerID="821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090" Dec 06 07:12:58 crc kubenswrapper[4954]: I1206 07:12:58.096231 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090"} err="failed to get container status \"821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090\": rpc error: code = NotFound desc = could not find container \"821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090\": container with ID starting with 821c9582cb1cf60ed42eaad50fc1227e0be7132796f03b489fbf0db48bbd1090 not found: ID does not exist" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.024528 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerStarted","Data":"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd"} Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.028782 4954 generic.go:334] "Generic (PLEG): container finished" podID="bda5decc-cd20-487a-b0b5-01058cac828c" containerID="dc700393c3012d2e28e1647799ecebf9aebfb57c8de870e10ab2ffb82b09aea4" exitCode=0 Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.028876 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerDied","Data":"dc700393c3012d2e28e1647799ecebf9aebfb57c8de870e10ab2ffb82b09aea4"} Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.056610 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rdgp" podStartSLOduration=6.640640369 podStartE2EDuration="8.056577143s" podCreationTimestamp="2025-12-06 07:12:51 +0000 UTC" firstStartedPulling="2025-12-06 07:12:56.977302349 +0000 UTC m=+951.790661738" lastFinishedPulling="2025-12-06 07:12:58.393239123 +0000 UTC m=+953.206598512" observedRunningTime="2025-12-06 07:12:59.048143678 +0000 UTC m=+953.861503067" watchObservedRunningTime="2025-12-06 07:12:59.056577143 +0000 UTC m=+953.869936532" Dec 06 07:12:59 crc 
kubenswrapper[4954]: I1206 07:12:59.351727 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx"] Dec 06 07:12:59 crc kubenswrapper[4954]: E1206 07:12:59.352471 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="registry-server" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.352491 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="registry-server" Dec 06 07:12:59 crc kubenswrapper[4954]: E1206 07:12:59.352500 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="extract-content" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.352507 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="extract-content" Dec 06 07:12:59 crc kubenswrapper[4954]: E1206 07:12:59.352521 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="extract-utilities" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.352528 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="extract-utilities" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.352674 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" containerName="registry-server" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.353708 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.356104 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.364456 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx"] Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.453418 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91" path="/var/lib/kubelet/pods/7ffb9a25-a6b5-4d89-b930-9bc5a7d88d91/volumes" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.539400 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.539679 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.539789 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-b6xbp\" (UniqueName: \"kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.641595 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.641685 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xbp\" (UniqueName: \"kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.641743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.642149 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.642234 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.663859 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xbp\" (UniqueName: \"kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.671433 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:12:59 crc kubenswrapper[4954]: I1206 07:12:59.919998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx"] Dec 06 07:13:00 crc kubenswrapper[4954]: I1206 07:13:00.037057 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" event={"ID":"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a","Type":"ContainerStarted","Data":"2447726ee97e86869419c6f6c983220eb1f9b0a3ff9b41fab8bc5e69761ac64e"} Dec 06 07:13:00 crc kubenswrapper[4954]: I1206 07:13:00.039776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"67e315156c640124a0c4176df637e1156a84d0b31ce9a0b23b5d24edd083cb4c"} Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.051673 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"2537668afa0875ebf7b15671c95b04afe341b2b826a2e614045f3140218ad95d"} Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.052084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"776038112f9cd9302d526e4824e2ebd4c3ee00c11ca8851c5714cd7367ad8df2"} Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.052096 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"32ca443624b813e3a2ed4411dbdde489a9d56cf28f0a59285d17bc1df9c029d1"} Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.052107 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"39a1a88d0e6729d10aae1e066c471b3bbe7a4ce3067c78c8ae5506b38313ce56"} Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.054790 4954 generic.go:334] "Generic (PLEG): container finished" podID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerID="653c742c5d0eabb8a91ef293d10c0971371c750a27432b1e92fdfcdf250dedad" exitCode=0 Dec 06 07:13:01 crc kubenswrapper[4954]: I1206 07:13:01.054838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" event={"ID":"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a","Type":"ContainerDied","Data":"653c742c5d0eabb8a91ef293d10c0971371c750a27432b1e92fdfcdf250dedad"} Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.068664 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p95p2" event={"ID":"bda5decc-cd20-487a-b0b5-01058cac828c","Type":"ContainerStarted","Data":"13a85265e0ec2afd919671d420796bc60d18c8f5f2a19c91b8207de85c76265a"} Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.069220 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.111702 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-p95p2" podStartSLOduration=7.094232267 podStartE2EDuration="17.11167162s" 
podCreationTimestamp="2025-12-06 07:12:45 +0000 UTC" firstStartedPulling="2025-12-06 07:12:45.922374457 +0000 UTC m=+940.735733866" lastFinishedPulling="2025-12-06 07:12:55.93981383 +0000 UTC m=+950.753173219" observedRunningTime="2025-12-06 07:13:02.104754575 +0000 UTC m=+956.918113984" watchObservedRunningTime="2025-12-06 07:13:02.11167162 +0000 UTC m=+956.925031009" Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.316946 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.317047 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:02 crc kubenswrapper[4954]: I1206 07:13:02.382181 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:03 crc kubenswrapper[4954]: I1206 07:13:03.130102 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:05 crc kubenswrapper[4954]: I1206 07:13:05.660711 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:13:05 crc kubenswrapper[4954]: I1206 07:13:05.833947 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-ql4ww" Dec 06 07:13:05 crc kubenswrapper[4954]: I1206 07:13:05.924851 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:13:05 crc kubenswrapper[4954]: I1206 07:13:05.925292 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rdgp" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="registry-server" containerID="cri-o://0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd" gracePeriod=2 Dec 06 07:13:06 crc kubenswrapper[4954]: I1206 07:13:06.291273 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.046744 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.108848 4954 generic.go:334] "Generic (PLEG): container finished" podID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerID="0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd" exitCode=0 Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.108921 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rdgp" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.108928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerDied","Data":"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd"} Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.108965 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rdgp" event={"ID":"38e3ea1c-bed8-47b9-b229-c30fbd6046f4","Type":"ContainerDied","Data":"1950352de1859f6615d9a2ca5405eddabe0401bda345840bc1b0b0d20a4ece20"} Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.108994 4954 scope.go:117] "RemoveContainer" containerID="0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.132851 4954 scope.go:117] "RemoveContainer" containerID="28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.155507 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities\") pod \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.155726 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content\") pod \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.155829 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vgxt\" (UniqueName: \"kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt\") pod \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\" (UID: \"38e3ea1c-bed8-47b9-b229-c30fbd6046f4\") " Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.156458 4954 scope.go:117] "RemoveContainer" containerID="29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.158412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities" (OuterVolumeSpecName: "utilities") pod "38e3ea1c-bed8-47b9-b229-c30fbd6046f4" (UID: "38e3ea1c-bed8-47b9-b229-c30fbd6046f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.165816 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt" (OuterVolumeSpecName: "kube-api-access-7vgxt") pod "38e3ea1c-bed8-47b9-b229-c30fbd6046f4" (UID: "38e3ea1c-bed8-47b9-b229-c30fbd6046f4"). InnerVolumeSpecName "kube-api-access-7vgxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.205587 4954 scope.go:117] "RemoveContainer" containerID="0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd" Dec 06 07:13:07 crc kubenswrapper[4954]: E1206 07:13:07.209765 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd\": container with ID starting with 0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd not found: ID does not exist" containerID="0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.209850 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd"} err="failed to get container status \"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd\": rpc error: code = NotFound desc = could not find container \"0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd\": container with ID starting with 0ae205303f13b5856854b2fd6cc9c7c00f5e6a629fb8e27e2ed40274909c87cd not found: ID does not exist" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.209893 4954 scope.go:117] "RemoveContainer" containerID="28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332" Dec 06 07:13:07 crc kubenswrapper[4954]: E1206 07:13:07.210960 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332\": container with ID starting with 28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332 not found: ID does not exist" containerID="28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.211040 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332"} err="failed to get container status \"28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332\": rpc error: code = NotFound desc = could not find container \"28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332\": container with ID starting with 28caa9bee0c4ff8e2d76806241af69ea721e8cf4e7c4ad0a674e8e304dd98332 not found: ID does not exist" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.211085 4954 scope.go:117] "RemoveContainer" containerID="29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819" Dec 06 07:13:07 crc kubenswrapper[4954]: E1206 07:13:07.211930 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819\": container with ID starting with 29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819 not found: ID does not exist" containerID="29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.211999 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819"} err="failed to get container status \"29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819\": rpc error: code = NotFound desc = could not 
find container \"29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819\": container with ID starting with 29a97ba238f1e5be2de572f92fbce1a555bee84b2c876506644e57b8bb05e819 not found: ID does not exist" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.228282 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38e3ea1c-bed8-47b9-b229-c30fbd6046f4" (UID: "38e3ea1c-bed8-47b9-b229-c30fbd6046f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.258531 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.258593 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.258620 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vgxt\" (UniqueName: \"kubernetes.io/projected/38e3ea1c-bed8-47b9-b229-c30fbd6046f4-kube-api-access-7vgxt\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.455903 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:13:07 crc kubenswrapper[4954]: I1206 07:13:07.456314 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rdgp"] Dec 06 07:13:08 crc kubenswrapper[4954]: I1206 07:13:08.118073 4954 generic.go:334] "Generic (PLEG): container finished" podID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerID="89412b5de07d708e54a08692302ac40d0e6eba9fef14b87c552bc5e7bd39071e" exitCode=0 Dec 06 07:13:08 crc kubenswrapper[4954]: I1206 07:13:08.118132 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" event={"ID":"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a","Type":"ContainerDied","Data":"89412b5de07d708e54a08692302ac40d0e6eba9fef14b87c552bc5e7bd39071e"} Dec 06 07:13:09 crc kubenswrapper[4954]: I1206 07:13:09.142470 4954 generic.go:334] "Generic (PLEG): container finished" podID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerID="e894ce806676c303e727af901aab7c464510bd083810638bce5bd8ab71039255" exitCode=0 Dec 06 07:13:09 crc kubenswrapper[4954]: I1206 07:13:09.142593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" event={"ID":"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a","Type":"ContainerDied","Data":"e894ce806676c303e727af901aab7c464510bd083810638bce5bd8ab71039255"} Dec 06 07:13:09 crc kubenswrapper[4954]: I1206 07:13:09.453337 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" path="/var/lib/kubelet/pods/38e3ea1c-bed8-47b9-b229-c30fbd6046f4/volumes" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.101862 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.101957 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.102014 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.102544 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.102633 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233" gracePeriod=600 Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.419759 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.506205 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util\") pod \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.506353 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6xbp\" (UniqueName: \"kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp\") pod \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.506591 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle\") pod \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\" (UID: \"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a\") " Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.508258 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle" (OuterVolumeSpecName: "bundle") pod "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" (UID: "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.515020 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp" (OuterVolumeSpecName: "kube-api-access-b6xbp") pod "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" (UID: "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a"). InnerVolumeSpecName "kube-api-access-b6xbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.517059 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util" (OuterVolumeSpecName: "util") pod "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" (UID: "456e3ce9-0eb6-466a-92f1-7eaa684b6b7a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.607954 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.607996 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-util\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:10 crc kubenswrapper[4954]: I1206 07:13:10.608014 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6xbp\" (UniqueName: \"kubernetes.io/projected/456e3ce9-0eb6-466a-92f1-7eaa684b6b7a-kube-api-access-b6xbp\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.161077 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233" exitCode=0 Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.161117 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233"} Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.161797 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403"} Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.161839 4954 scope.go:117] "RemoveContainer" containerID="cfcc55f9e114bfc684b44c6b3eb548c36a24218366f0dfbb06f158d610a881a9" Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.166718 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" event={"ID":"456e3ce9-0eb6-466a-92f1-7eaa684b6b7a","Type":"ContainerDied","Data":"2447726ee97e86869419c6f6c983220eb1f9b0a3ff9b41fab8bc5e69761ac64e"} Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.166771 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2447726ee97e86869419c6f6c983220eb1f9b0a3ff9b41fab8bc5e69761ac64e" Dec 06 07:13:11 crc kubenswrapper[4954]: I1206 07:13:11.166844 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.872820 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht"] Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873728 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="pull" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873742 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="pull" Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873754 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="extract-content" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873760 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="extract-content" Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873771 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="extract-utilities" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873778 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="extract-utilities" Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873792 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="registry-server" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873797 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="registry-server" Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873805 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="util" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873811 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="util" Dec 06 07:13:14 crc kubenswrapper[4954]: E1206 07:13:14.873824 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="extract" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873829 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="extract" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e3ea1c-bed8-47b9-b229-c30fbd6046f4" containerName="registry-server" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.873970 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="456e3ce9-0eb6-466a-92f1-7eaa684b6b7a" containerName="extract" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.874494 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.877477 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.877529 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.879175 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-p89sz" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.912690 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht"] Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.979996 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87fc9ae6-67d4-4501-a14e-119336c63c62-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:14 crc kubenswrapper[4954]: I1206 07:13:14.980074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4np\" (UniqueName: \"kubernetes.io/projected/87fc9ae6-67d4-4501-a14e-119336c63c62-kube-api-access-hx4np\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.082210 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87fc9ae6-67d4-4501-a14e-119336c63c62-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.082314 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx4np\" (UniqueName: \"kubernetes.io/projected/87fc9ae6-67d4-4501-a14e-119336c63c62-kube-api-access-hx4np\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.083129 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87fc9ae6-67d4-4501-a14e-119336c63c62-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.107286 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4np\" (UniqueName: \"kubernetes.io/projected/87fc9ae6-67d4-4501-a14e-119336c63c62-kube-api-access-hx4np\") pod \"cert-manager-operator-controller-manager-64cf6dff88-dj6ht\" (UID: \"87fc9ae6-67d4-4501-a14e-119336c63c62\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.196100 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.562056 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht"] Dec 06 07:13:15 crc kubenswrapper[4954]: W1206 07:13:15.609189 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87fc9ae6_67d4_4501_a14e_119336c63c62.slice/crio-edbff36f2b0b75c506d5484368094fa7fcca93c7a7038dc16afaff86e4ed9cb5 WatchSource:0}: Error finding container edbff36f2b0b75c506d5484368094fa7fcca93c7a7038dc16afaff86e4ed9cb5: Status 404 returned error can't find the container with id edbff36f2b0b75c506d5484368094fa7fcca93c7a7038dc16afaff86e4ed9cb5 Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.662121 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p95p2" Dec 06 07:13:15 crc kubenswrapper[4954]: I1206 07:13:15.679025 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-8f4wp" Dec 06 07:13:16 crc kubenswrapper[4954]: I1206 07:13:16.396324 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" event={"ID":"87fc9ae6-67d4-4501-a14e-119336c63c62","Type":"ContainerStarted","Data":"edbff36f2b0b75c506d5484368094fa7fcca93c7a7038dc16afaff86e4ed9cb5"} Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.498943 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.501460 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.546541 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.631210 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.631352 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5zlc\" (UniqueName: \"kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.631415 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.733165 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5zlc\" (UniqueName: \"kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.733285 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.733341 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.733999 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.734134 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.755897 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j5zlc\" (UniqueName: \"kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc\") pod \"redhat-marketplace-8s6mp\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:22 crc kubenswrapper[4954]: I1206 07:13:22.830269 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:24 crc kubenswrapper[4954]: I1206 07:13:24.758653 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:24 crc kubenswrapper[4954]: W1206 07:13:24.767530 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e2a597_a341_44b7_b139_80143e54b20f.slice/crio-1c9cece24bff7c6df27f2455f8c1297e5b4b1faad142f5341365dbea9eedef00 WatchSource:0}: Error finding container 1c9cece24bff7c6df27f2455f8c1297e5b4b1faad142f5341365dbea9eedef00: Status 404 returned error can't find the container with id 1c9cece24bff7c6df27f2455f8c1297e5b4b1faad142f5341365dbea9eedef00 Dec 06 07:13:25 crc kubenswrapper[4954]: I1206 07:13:25.607069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerStarted","Data":"128f3c0691d7b0f39a838870ac434f1deb937ffd1daa2533394084698c10ae01"} Dec 06 07:13:25 crc kubenswrapper[4954]: I1206 07:13:25.607672 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerStarted","Data":"1c9cece24bff7c6df27f2455f8c1297e5b4b1faad142f5341365dbea9eedef00"} Dec 06 07:13:25 crc kubenswrapper[4954]: I1206 07:13:25.609416 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" event={"ID":"87fc9ae6-67d4-4501-a14e-119336c63c62","Type":"ContainerStarted","Data":"43783bdbeef9ef2970bb1862d619b1a1bd481f7ebd6ebc153bbcf6e64e2de94e"} Dec 06 07:13:25 crc kubenswrapper[4954]: I1206 07:13:25.632978 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-dj6ht" podStartSLOduration=2.604310056 podStartE2EDuration="11.632950298s" podCreationTimestamp="2025-12-06 07:13:14 +0000 UTC" firstStartedPulling="2025-12-06 07:13:15.61679493 +0000 UTC m=+970.430154319" lastFinishedPulling="2025-12-06 07:13:24.645435172 +0000 UTC m=+979.458794561" observedRunningTime="2025-12-06 07:13:25.630207785 +0000 UTC m=+980.443567184" watchObservedRunningTime="2025-12-06 07:13:25.632950298 +0000 UTC m=+980.446309707" Dec 06 07:13:26 crc kubenswrapper[4954]: I1206 07:13:26.617575 4954 generic.go:334] "Generic (PLEG): container finished" podID="e3e2a597-a341-44b7-b139-80143e54b20f" containerID="128f3c0691d7b0f39a838870ac434f1deb937ffd1daa2533394084698c10ae01" exitCode=0 Dec 06 07:13:26 crc kubenswrapper[4954]: I1206 07:13:26.618193 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerDied","Data":"128f3c0691d7b0f39a838870ac434f1deb937ffd1daa2533394084698c10ae01"} Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.226752 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-f4fb5df64-djdk2"] Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.228181 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.231832 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.231923 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tnjd2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.232165 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.240680 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-djdk2"] Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.337260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5tb\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-kube-api-access-zr5tb\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.337536 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.439994 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.440165 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5tb\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-kube-api-access-zr5tb\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.461463 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.461494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5tb\" (UniqueName: \"kubernetes.io/projected/21c46708-0103-458e-b609-635ce217adb1-kube-api-access-zr5tb\") pod \"cert-manager-webhook-f4fb5df64-djdk2\" (UID: \"21c46708-0103-458e-b609-635ce217adb1\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.547480 4954 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.645386 4954 generic.go:334] "Generic (PLEG): container finished" podID="e3e2a597-a341-44b7-b139-80143e54b20f" containerID="dd7a88c677f0c84d17890ce107015bda08b52c75fe3f85885ce06500b49d44d8" exitCode=0 Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.645885 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerDied","Data":"dd7a88c677f0c84d17890ce107015bda08b52c75fe3f85885ce06500b49d44d8"} Dec 06 07:13:28 crc kubenswrapper[4954]: I1206 07:13:28.826791 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-djdk2"] Dec 06 07:13:28 crc kubenswrapper[4954]: W1206 07:13:28.832502 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21c46708_0103_458e_b609_635ce217adb1.slice/crio-8809de5b5231f089d2de2b03997a0ca0daee49e7197af96ff52327ec14ed2aae WatchSource:0}: Error finding container 8809de5b5231f089d2de2b03997a0ca0daee49e7197af96ff52327ec14ed2aae: Status 404 returned error can't find the container with id 8809de5b5231f089d2de2b03997a0ca0daee49e7197af96ff52327ec14ed2aae Dec 06 07:13:29 crc kubenswrapper[4954]: I1206 07:13:29.654285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" event={"ID":"21c46708-0103-458e-b609-635ce217adb1","Type":"ContainerStarted","Data":"8809de5b5231f089d2de2b03997a0ca0daee49e7197af96ff52327ec14ed2aae"} Dec 06 07:13:29 crc kubenswrapper[4954]: I1206 07:13:29.659716 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerStarted","Data":"39879f595da9330642eda6ae49422e51d3520bd036604fb7e74bdc24b8842f5b"} Dec 06 07:13:29 crc kubenswrapper[4954]: I1206 07:13:29.678536 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8s6mp" podStartSLOduration=5.271019213 podStartE2EDuration="7.678501109s" podCreationTimestamp="2025-12-06 07:13:22 +0000 UTC" firstStartedPulling="2025-12-06 07:13:26.6203299 +0000 UTC m=+981.433689289" lastFinishedPulling="2025-12-06 07:13:29.027811796 +0000 UTC m=+983.841171185" observedRunningTime="2025-12-06 07:13:29.675618452 +0000 UTC m=+984.488977851" watchObservedRunningTime="2025-12-06 07:13:29.678501109 +0000 UTC m=+984.491860508" Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.853494 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-z4hph"] Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.858800 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.862913 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rsq4q" Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.877103 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-z4hph"] Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.908908 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:31 crc kubenswrapper[4954]: I1206 07:13:31.909057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq78v\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-kube-api-access-sq78v\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.010619 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.010774 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq78v\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-kube-api-access-sq78v\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.034552 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.063627 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq78v\" (UniqueName: \"kubernetes.io/projected/e45848cf-9e42-436e-9719-8229bac3fb66-kube-api-access-sq78v\") pod \"cert-manager-cainjector-855d9ccff4-z4hph\" (UID: \"e45848cf-9e42-436e-9719-8229bac3fb66\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.177882 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.738923 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-z4hph"] Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.831413 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.831961 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:32 crc kubenswrapper[4954]: I1206 07:13:32.899443 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.129180 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-qvrwn"] Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.130671 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.133924 4954 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8ncd6" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.142101 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-qvrwn"] Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.290185 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-bound-sa-token\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.290269 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkl2t\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-kube-api-access-tkl2t\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.391591 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-bound-sa-token\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.391643 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkl2t\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-kube-api-access-tkl2t\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.414922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkl2t\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-kube-api-access-tkl2t\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " 
pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.415992 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3edd2749-556f-4ce2-a0d9-e13cb2455243-bound-sa-token\") pod \"cert-manager-86cb77c54b-qvrwn\" (UID: \"3edd2749-556f-4ce2-a0d9-e13cb2455243\") " pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:38 crc kubenswrapper[4954]: I1206 07:13:38.457810 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-qvrwn" Dec 06 07:13:39 crc kubenswrapper[4954]: I1206 07:13:39.766188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" event={"ID":"21c46708-0103-458e-b609-635ce217adb1","Type":"ContainerStarted","Data":"4d30d5bf80929e7f7d8b56558af5d50613602dbbaf8663336899e7c754be1d1b"} Dec 06 07:13:39 crc kubenswrapper[4954]: I1206 07:13:39.766834 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:39 crc kubenswrapper[4954]: I1206 07:13:39.767892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" event={"ID":"e45848cf-9e42-436e-9719-8229bac3fb66","Type":"ContainerStarted","Data":"c0dbd3d1768d602aac32572a00d498e5909bcb87f17e9ad734bb8c69e7ae9406"} Dec 06 07:13:39 crc kubenswrapper[4954]: I1206 07:13:39.784830 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-qvrwn"] Dec 06 07:13:39 crc kubenswrapper[4954]: I1206 07:13:39.788470 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" podStartSLOduration=1.151520849 podStartE2EDuration="11.788450411s" podCreationTimestamp="2025-12-06 07:13:28 +0000 UTC" firstStartedPulling="2025-12-06 07:13:28.836493088 +0000 UTC m=+983.649852477" lastFinishedPulling="2025-12-06 07:13:39.47342265 +0000 UTC m=+994.286782039" observedRunningTime="2025-12-06 07:13:39.78017625 +0000 UTC m=+994.593535639" watchObservedRunningTime="2025-12-06 07:13:39.788450411 +0000 UTC m=+994.601809800" Dec 06 07:13:39 crc kubenswrapper[4954]: W1206 07:13:39.790010 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3edd2749_556f_4ce2_a0d9_e13cb2455243.slice/crio-12b01d92f1d61c83e88b9522fad7d60af5c49d716a8cb35ebacc3857f1681165 WatchSource:0}: Error finding container 12b01d92f1d61c83e88b9522fad7d60af5c49d716a8cb35ebacc3857f1681165: Status 404 returned error can't find the container with id 12b01d92f1d61c83e88b9522fad7d60af5c49d716a8cb35ebacc3857f1681165 Dec 06 07:13:40 crc kubenswrapper[4954]: I1206 07:13:40.785772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" event={"ID":"e45848cf-9e42-436e-9719-8229bac3fb66","Type":"ContainerStarted","Data":"fef49cded62efec69df33e4929f158690d3328d13652e1441355f3fbf4cd21b4"} Dec 06 07:13:40 crc kubenswrapper[4954]: I1206 07:13:40.788823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-qvrwn" event={"ID":"3edd2749-556f-4ce2-a0d9-e13cb2455243","Type":"ContainerStarted","Data":"34d3cde47c745431793fdcaed2d9b6e3b144af2027381d26ba3466d1af94ab7c"} Dec 06 07:13:40 crc kubenswrapper[4954]: I1206 07:13:40.788892 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-qvrwn" event={"ID":"3edd2749-556f-4ce2-a0d9-e13cb2455243","Type":"ContainerStarted","Data":"12b01d92f1d61c83e88b9522fad7d60af5c49d716a8cb35ebacc3857f1681165"} Dec 06 07:13:40 crc kubenswrapper[4954]: I1206 07:13:40.802749 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-z4hph" podStartSLOduration=8.86931359 podStartE2EDuration="9.802729031s" podCreationTimestamp="2025-12-06 07:13:31 +0000 UTC" firstStartedPulling="2025-12-06 07:13:39.340698347 +0000 UTC m=+994.154057736" lastFinishedPulling="2025-12-06 07:13:40.274113798 +0000 UTC m=+995.087473177" observedRunningTime="2025-12-06 07:13:40.800553933 +0000 UTC m=+995.613913332" watchObservedRunningTime="2025-12-06 07:13:40.802729031 +0000 UTC m=+995.616088420" Dec 06 07:13:40 crc kubenswrapper[4954]: I1206 07:13:40.824435 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-qvrwn" podStartSLOduration=2.82441093 podStartE2EDuration="2.82441093s" podCreationTimestamp="2025-12-06 07:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:13:40.821478012 +0000 UTC m=+995.634837411" watchObservedRunningTime="2025-12-06 07:13:40.82441093 +0000 UTC m=+995.637770319" Dec 06 07:13:42 crc kubenswrapper[4954]: I1206 07:13:42.888541 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:42 crc kubenswrapper[4954]: I1206 07:13:42.963941 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:43 crc kubenswrapper[4954]: I1206 07:13:43.806620 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8s6mp" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="registry-server" containerID="cri-o://39879f595da9330642eda6ae49422e51d3520bd036604fb7e74bdc24b8842f5b" gracePeriod=2 Dec 06 07:13:45 crc kubenswrapper[4954]: I1206 07:13:45.826718 4954 generic.go:334] "Generic (PLEG): container finished" podID="e3e2a597-a341-44b7-b139-80143e54b20f" containerID="39879f595da9330642eda6ae49422e51d3520bd036604fb7e74bdc24b8842f5b" exitCode=0 Dec 06 07:13:45 crc kubenswrapper[4954]: I1206 07:13:45.826841 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerDied","Data":"39879f595da9330642eda6ae49422e51d3520bd036604fb7e74bdc24b8842f5b"} Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.156694 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.311038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5zlc\" (UniqueName: \"kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc\") pod \"e3e2a597-a341-44b7-b139-80143e54b20f\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.311881 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities\") pod \"e3e2a597-a341-44b7-b139-80143e54b20f\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.312859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities" (OuterVolumeSpecName: "utilities") pod "e3e2a597-a341-44b7-b139-80143e54b20f" (UID: "e3e2a597-a341-44b7-b139-80143e54b20f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.312558 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content\") pod \"e3e2a597-a341-44b7-b139-80143e54b20f\" (UID: \"e3e2a597-a341-44b7-b139-80143e54b20f\") " Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.313908 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.318307 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc" (OuterVolumeSpecName: "kube-api-access-j5zlc") pod "e3e2a597-a341-44b7-b139-80143e54b20f" (UID: "e3e2a597-a341-44b7-b139-80143e54b20f"). InnerVolumeSpecName "kube-api-access-j5zlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.336685 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3e2a597-a341-44b7-b139-80143e54b20f" (UID: "e3e2a597-a341-44b7-b139-80143e54b20f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.416029 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e2a597-a341-44b7-b139-80143e54b20f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.416083 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5zlc\" (UniqueName: \"kubernetes.io/projected/e3e2a597-a341-44b7-b139-80143e54b20f-kube-api-access-j5zlc\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.837525 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s6mp" event={"ID":"e3e2a597-a341-44b7-b139-80143e54b20f","Type":"ContainerDied","Data":"1c9cece24bff7c6df27f2455f8c1297e5b4b1faad142f5341365dbea9eedef00"} Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.837645 4954 scope.go:117] "RemoveContainer" containerID="39879f595da9330642eda6ae49422e51d3520bd036604fb7e74bdc24b8842f5b" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.837641 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s6mp" Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.874767 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:46 crc kubenswrapper[4954]: I1206 07:13:46.880726 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s6mp"] Dec 06 07:13:47 crc kubenswrapper[4954]: I1206 07:13:47.085101 4954 scope.go:117] "RemoveContainer" containerID="dd7a88c677f0c84d17890ce107015bda08b52c75fe3f85885ce06500b49d44d8" Dec 06 07:13:47 crc kubenswrapper[4954]: I1206 07:13:47.104113 4954 scope.go:117] "RemoveContainer" containerID="128f3c0691d7b0f39a838870ac434f1deb937ffd1daa2533394084698c10ae01" Dec 06 07:13:47 crc kubenswrapper[4954]: I1206 07:13:47.452285 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" path="/var/lib/kubelet/pods/e3e2a597-a341-44b7-b139-80143e54b20f/volumes" Dec 06 07:13:48 crc kubenswrapper[4954]: I1206 07:13:48.551485 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-djdk2" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.716164 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:13:52 crc kubenswrapper[4954]: E1206 07:13:52.717022 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="registry-server" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.717124 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="registry-server" Dec 06 07:13:52 crc kubenswrapper[4954]: E1206 07:13:52.717193 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="extract-utilities" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.717255 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="extract-utilities" Dec 06 07:13:52 crc kubenswrapper[4954]: E1206 07:13:52.717317 4954 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="extract-content" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.717375 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="extract-content" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.717551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e2a597-a341-44b7-b139-80143e54b20f" containerName="registry-server" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.718148 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.720291 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n2g9r" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.720441 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.721271 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.733497 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.818023 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnm9l\" (UniqueName: \"kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l\") pod \"openstack-operator-index-v57vx\" (UID: \"4a59828a-a1dd-4a48-8d82-6923eda41293\") " pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.919420 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnm9l\" (UniqueName: \"kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l\") pod \"openstack-operator-index-v57vx\" (UID: \"4a59828a-a1dd-4a48-8d82-6923eda41293\") " pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:52 crc kubenswrapper[4954]: I1206 07:13:52.942025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnm9l\" (UniqueName: \"kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l\") pod \"openstack-operator-index-v57vx\" (UID: \"4a59828a-a1dd-4a48-8d82-6923eda41293\") " pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:53 crc kubenswrapper[4954]: I1206 07:13:53.035709 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:53 crc kubenswrapper[4954]: I1206 07:13:53.476171 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:13:53 crc kubenswrapper[4954]: I1206 07:13:53.886749 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v57vx" event={"ID":"4a59828a-a1dd-4a48-8d82-6923eda41293","Type":"ContainerStarted","Data":"97a1a44b2d900179f5b7fed17bc3c3274da0d90137a186dd838a5eca99a7f911"} Dec 06 07:13:56 crc kubenswrapper[4954]: I1206 07:13:56.910790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v57vx" event={"ID":"4a59828a-a1dd-4a48-8d82-6923eda41293","Type":"ContainerStarted","Data":"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6"} Dec 06 07:13:56 crc kubenswrapper[4954]: I1206 07:13:56.928296 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v57vx" podStartSLOduration=2.4602731 podStartE2EDuration="4.928273242s" podCreationTimestamp="2025-12-06 07:13:52 +0000 UTC" firstStartedPulling="2025-12-06 07:13:53.496775115 +0000 UTC m=+1008.310134504" lastFinishedPulling="2025-12-06 07:13:55.964775257 +0000 UTC m=+1010.778134646" observedRunningTime="2025-12-06 07:13:56.926891465 +0000 UTC m=+1011.740250854" watchObservedRunningTime="2025-12-06 07:13:56.928273242 +0000 UTC m=+1011.741632631" Dec 06 07:13:57 crc kubenswrapper[4954]: I1206 07:13:57.899054 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.305373 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ts4vc"] Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.306210 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.316205 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ts4vc"] Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.397362 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pchsr\" (UniqueName: \"kubernetes.io/projected/4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a-kube-api-access-pchsr\") pod \"openstack-operator-index-ts4vc\" (UID: \"4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a\") " pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.498693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pchsr\" (UniqueName: \"kubernetes.io/projected/4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a-kube-api-access-pchsr\") pod \"openstack-operator-index-ts4vc\" (UID: \"4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a\") " pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.517519 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pchsr\" (UniqueName: \"kubernetes.io/projected/4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a-kube-api-access-pchsr\") pod \"openstack-operator-index-ts4vc\" (UID: \"4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a\") " pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.625477 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:13:58 crc kubenswrapper[4954]: I1206 07:13:58.924179 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v57vx" podUID="4a59828a-a1dd-4a48-8d82-6923eda41293" containerName="registry-server" containerID="cri-o://113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6" gracePeriod=2 Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.091270 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ts4vc"] Dec 06 07:13:59 crc kubenswrapper[4954]: W1206 07:13:59.106759 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee8b8a2_75dc_4a36_93e4_4c7b6da2432a.slice/crio-fdef3b2f5c8fabfa721eb617594344f1e47c84337ad78d94fd6a4be34b088367 WatchSource:0}: Error finding container fdef3b2f5c8fabfa721eb617594344f1e47c84337ad78d94fd6a4be34b088367: Status 404 returned error can't find the container with id fdef3b2f5c8fabfa721eb617594344f1e47c84337ad78d94fd6a4be34b088367 Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.252964 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.409677 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnm9l\" (UniqueName: \"kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l\") pod \"4a59828a-a1dd-4a48-8d82-6923eda41293\" (UID: \"4a59828a-a1dd-4a48-8d82-6923eda41293\") " Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.416033 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l" (OuterVolumeSpecName: "kube-api-access-fnm9l") pod "4a59828a-a1dd-4a48-8d82-6923eda41293" (UID: "4a59828a-a1dd-4a48-8d82-6923eda41293"). InnerVolumeSpecName "kube-api-access-fnm9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.511862 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnm9l\" (UniqueName: \"kubernetes.io/projected/4a59828a-a1dd-4a48-8d82-6923eda41293-kube-api-access-fnm9l\") on node \"crc\" DevicePath \"\"" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.931996 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts4vc" event={"ID":"4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a","Type":"ContainerStarted","Data":"2b58d124508b9e352d726616105802514043a3e49ef88977cd6ae20b0a235f1e"} Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.932289 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts4vc" event={"ID":"4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a","Type":"ContainerStarted","Data":"fdef3b2f5c8fabfa721eb617594344f1e47c84337ad78d94fd6a4be34b088367"} Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.934171 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a59828a-a1dd-4a48-8d82-6923eda41293" containerID="113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6" exitCode=0 Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.934229 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v57vx" event={"ID":"4a59828a-a1dd-4a48-8d82-6923eda41293","Type":"ContainerDied","Data":"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6"} Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.934292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v57vx" event={"ID":"4a59828a-a1dd-4a48-8d82-6923eda41293","Type":"ContainerDied","Data":"97a1a44b2d900179f5b7fed17bc3c3274da0d90137a186dd838a5eca99a7f911"} Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.934309 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v57vx" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.934321 4954 scope.go:117] "RemoveContainer" containerID="113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.954441 4954 scope.go:117] "RemoveContainer" containerID="113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.955016 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ts4vc" podStartSLOduration=1.541023418 podStartE2EDuration="1.954998441s" podCreationTimestamp="2025-12-06 07:13:58 +0000 UTC" firstStartedPulling="2025-12-06 07:13:59.11187073 +0000 UTC m=+1013.925230109" lastFinishedPulling="2025-12-06 07:13:59.525845743 +0000 UTC m=+1014.339205132" observedRunningTime="2025-12-06 07:13:59.95494699 +0000 UTC m=+1014.768306379" watchObservedRunningTime="2025-12-06 07:13:59.954998441 +0000 UTC m=+1014.768357830" Dec 06 07:13:59 crc kubenswrapper[4954]: E1206 07:13:59.955041 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6\": container with ID starting with 113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6 not found: ID does not exist" containerID="113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.955203 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6"} err="failed to get container status \"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6\": rpc error: code = NotFound desc = could not find container \"113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6\": container with ID starting with 113b10d7880a76daa5b02a50f4c3e31136dbfa218e83c95276e0b7afff0bb1b6 not found: ID does not exist" Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.968478 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:13:59 crc kubenswrapper[4954]: I1206 07:13:59.973083 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v57vx"] Dec 06 07:14:01 crc kubenswrapper[4954]: I1206 07:14:01.456120 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a59828a-a1dd-4a48-8d82-6923eda41293" path="/var/lib/kubelet/pods/4a59828a-a1dd-4a48-8d82-6923eda41293/volumes" Dec 06 07:14:08 crc kubenswrapper[4954]: I1206 07:14:08.625865 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:14:08 crc kubenswrapper[4954]: I1206 07:14:08.626963 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:14:08 crc kubenswrapper[4954]: I1206 07:14:08.663602 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.029551 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ts4vc" Dec 06 07:14:09 crc kubenswrapper[4954]: 
I1206 07:14:09.952974 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7"] Dec 06 07:14:09 crc kubenswrapper[4954]: E1206 07:14:09.953384 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a59828a-a1dd-4a48-8d82-6923eda41293" containerName="registry-server" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.953409 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a59828a-a1dd-4a48-8d82-6923eda41293" containerName="registry-server" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.953656 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a59828a-a1dd-4a48-8d82-6923eda41293" containerName="registry-server" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.955016 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.957684 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6xnq4" Dec 06 07:14:09 crc kubenswrapper[4954]: I1206 07:14:09.966350 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7"] Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.070881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.070953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxbl\" (UniqueName: \"kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.070983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.172830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.172948 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxbl\" (UniqueName: \"kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl\") pod 
\"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.172982 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.173497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.173634 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.197789 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxbl\" (UniqueName: \"kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.277190 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:10 crc kubenswrapper[4954]: I1206 07:14:10.731296 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7"] Dec 06 07:14:11 crc kubenswrapper[4954]: I1206 07:14:11.013053 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerStarted","Data":"231cbe3036f37cdbbb0ee58fb0cd93a6590201d5941014e58f222568d3b5159a"} Dec 06 07:14:11 crc kubenswrapper[4954]: I1206 07:14:11.013119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerStarted","Data":"d4f5a86ebc58d57d769bf74776155cc3b2394382df4b5589856fa78b3e40ecc1"} Dec 06 07:14:12 crc kubenswrapper[4954]: I1206 07:14:12.023671 4954 generic.go:334] "Generic (PLEG): container finished" podID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerID="231cbe3036f37cdbbb0ee58fb0cd93a6590201d5941014e58f222568d3b5159a" exitCode=0 Dec 06 07:14:12 crc kubenswrapper[4954]: I1206 07:14:12.023723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerDied","Data":"231cbe3036f37cdbbb0ee58fb0cd93a6590201d5941014e58f222568d3b5159a"} Dec 06 07:14:13 crc kubenswrapper[4954]: I1206 07:14:13.034621 4954 generic.go:334] "Generic (PLEG): container finished" podID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerID="c879b2eee6d22b9bbb89bf102984200b8e92f79598572482f539e8e12cd21a9d" exitCode=0 Dec 06 07:14:13 crc kubenswrapper[4954]: I1206 07:14:13.034736 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerDied","Data":"c879b2eee6d22b9bbb89bf102984200b8e92f79598572482f539e8e12cd21a9d"} Dec 06 07:14:14 crc kubenswrapper[4954]: I1206 07:14:14.043960 4954 generic.go:334] "Generic (PLEG): container finished" podID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerID="9ba9d902c396e6ba8d713a3843a7aa40ac0b3f1a4d8c1e03f59b3cec730e6cf4" exitCode=0 Dec 06 07:14:14 crc kubenswrapper[4954]: I1206 07:14:14.044047 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerDied","Data":"9ba9d902c396e6ba8d713a3843a7aa40ac0b3f1a4d8c1e03f59b3cec730e6cf4"} Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.328683 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.450640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util\") pod \"e725d552-aee7-4bc6-abff-f04a2b2522b2\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.450844 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccxbl\" (UniqueName: \"kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl\") pod \"e725d552-aee7-4bc6-abff-f04a2b2522b2\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.450922 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle\") pod \"e725d552-aee7-4bc6-abff-f04a2b2522b2\" (UID: \"e725d552-aee7-4bc6-abff-f04a2b2522b2\") " Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.451741 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle" (OuterVolumeSpecName: "bundle") pod "e725d552-aee7-4bc6-abff-f04a2b2522b2" (UID: "e725d552-aee7-4bc6-abff-f04a2b2522b2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.458709 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl" (OuterVolumeSpecName: "kube-api-access-ccxbl") pod "e725d552-aee7-4bc6-abff-f04a2b2522b2" (UID: "e725d552-aee7-4bc6-abff-f04a2b2522b2"). InnerVolumeSpecName "kube-api-access-ccxbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.465697 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util" (OuterVolumeSpecName: "util") pod "e725d552-aee7-4bc6-abff-f04a2b2522b2" (UID: "e725d552-aee7-4bc6-abff-f04a2b2522b2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.552536 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.552599 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e725d552-aee7-4bc6-abff-f04a2b2522b2-util\") on node \"crc\" DevicePath \"\"" Dec 06 07:14:15 crc kubenswrapper[4954]: I1206 07:14:15.552613 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccxbl\" (UniqueName: \"kubernetes.io/projected/e725d552-aee7-4bc6-abff-f04a2b2522b2-kube-api-access-ccxbl\") on node \"crc\" DevicePath \"\"" Dec 06 07:14:16 crc kubenswrapper[4954]: I1206 07:14:16.062021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" event={"ID":"e725d552-aee7-4bc6-abff-f04a2b2522b2","Type":"ContainerDied","Data":"d4f5a86ebc58d57d769bf74776155cc3b2394382df4b5589856fa78b3e40ecc1"} Dec 06 07:14:16 crc kubenswrapper[4954]: I1206 07:14:16.062074 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f5a86ebc58d57d769bf74776155cc3b2394382df4b5589856fa78b3e40ecc1" Dec 06 07:14:16 crc kubenswrapper[4954]: I1206 07:14:16.062178 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.189117 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85"] Dec 06 07:14:22 crc kubenswrapper[4954]: E1206 07:14:22.189959 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="util" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.189975 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="util" Dec 06 07:14:22 crc kubenswrapper[4954]: E1206 07:14:22.189991 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="extract" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.189997 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="extract" Dec 06 07:14:22 crc kubenswrapper[4954]: E1206 07:14:22.190011 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="pull" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.190018 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="pull" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.190129 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e725d552-aee7-4bc6-abff-f04a2b2522b2" containerName="extract" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.190606 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.194932 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4lml6" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.230070 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85"] Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.254720 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgpv\" (UniqueName: \"kubernetes.io/projected/e3a2a692-b8fa-4f39-b507-9dd36ab9593e-kube-api-access-gkgpv\") pod \"openstack-operator-controller-operator-55b6fb9447-h2z85\" (UID: \"e3a2a692-b8fa-4f39-b507-9dd36ab9593e\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.357054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgpv\" (UniqueName: \"kubernetes.io/projected/e3a2a692-b8fa-4f39-b507-9dd36ab9593e-kube-api-access-gkgpv\") pod \"openstack-operator-controller-operator-55b6fb9447-h2z85\" (UID: \"e3a2a692-b8fa-4f39-b507-9dd36ab9593e\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.382519 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgpv\" (UniqueName: \"kubernetes.io/projected/e3a2a692-b8fa-4f39-b507-9dd36ab9593e-kube-api-access-gkgpv\") pod \"openstack-operator-controller-operator-55b6fb9447-h2z85\" (UID: \"e3a2a692-b8fa-4f39-b507-9dd36ab9593e\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:22 crc kubenswrapper[4954]: I1206 07:14:22.510802 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:23 crc kubenswrapper[4954]: I1206 07:14:23.001232 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85"] Dec 06 07:14:23 crc kubenswrapper[4954]: W1206 07:14:23.018439 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a2a692_b8fa_4f39_b507_9dd36ab9593e.slice/crio-85e0934a5f0d0aeee7c8e663e9f5bbc3864f32fb95cd38013474c15a55a1992f WatchSource:0}: Error finding container 85e0934a5f0d0aeee7c8e663e9f5bbc3864f32fb95cd38013474c15a55a1992f: Status 404 returned error can't find the container with id 85e0934a5f0d0aeee7c8e663e9f5bbc3864f32fb95cd38013474c15a55a1992f Dec 06 07:14:23 crc kubenswrapper[4954]: I1206 07:14:23.118786 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" event={"ID":"e3a2a692-b8fa-4f39-b507-9dd36ab9593e","Type":"ContainerStarted","Data":"85e0934a5f0d0aeee7c8e663e9f5bbc3864f32fb95cd38013474c15a55a1992f"} Dec 06 07:14:34 crc kubenswrapper[4954]: I1206 07:14:34.246851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" event={"ID":"e3a2a692-b8fa-4f39-b507-9dd36ab9593e","Type":"ContainerStarted","Data":"c839a8e1a263da53aee71374bc123ceca06403123de38de7decadab8d3fcba50"} Dec 06 07:14:34 crc kubenswrapper[4954]: I1206 07:14:34.247831 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:14:34 crc kubenswrapper[4954]: I1206 07:14:34.307710 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" podStartSLOduration=1.40254978 podStartE2EDuration="12.30768725s" podCreationTimestamp="2025-12-06 07:14:22 +0000 UTC" firstStartedPulling="2025-12-06 07:14:23.027685484 +0000 UTC m=+1037.841044873" lastFinishedPulling="2025-12-06 07:14:33.932822954 +0000 UTC m=+1048.746182343" observedRunningTime="2025-12-06 07:14:34.281200501 +0000 UTC m=+1049.094559910" watchObservedRunningTime="2025-12-06 07:14:34.30768725 +0000 UTC m=+1049.121046639" Dec 06 07:14:42 crc kubenswrapper[4954]: I1206 07:14:42.529470 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.161175 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j"] Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.164956 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.172868 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.175527 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.176351 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j"] Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.308979 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.309063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.309174 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.410895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.410977 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.411041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.412199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume\") pod 
\"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.419028 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.432581 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t\") pod \"collect-profiles-29416755-nkc4j\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:00 crc kubenswrapper[4954]: I1206 07:15:00.486089 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:01 crc kubenswrapper[4954]: I1206 07:15:01.079284 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j"] Dec 06 07:15:01 crc kubenswrapper[4954]: I1206 07:15:01.450169 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" event={"ID":"94b53cf7-013c-4df3-82de-c438b35806ac","Type":"ContainerStarted","Data":"9bf0a4efe2e24ad85348bb01c4e3fe64356fe21da9707c1dbcbb87c496f8dea8"} Dec 06 07:15:01 crc kubenswrapper[4954]: I1206 07:15:01.450627 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" event={"ID":"94b53cf7-013c-4df3-82de-c438b35806ac","Type":"ContainerStarted","Data":"8a9bfa690967a97700bc966f7d05a7dd038db940918d6e971893c923ec8c0d60"} Dec 06 07:15:01 crc kubenswrapper[4954]: I1206 07:15:01.469878 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" podStartSLOduration=1.469845798 podStartE2EDuration="1.469845798s" podCreationTimestamp="2025-12-06 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:15:01.468516503 +0000 UTC m=+1076.281875912" watchObservedRunningTime="2025-12-06 07:15:01.469845798 +0000 UTC m=+1076.283205187" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.465991 4954 generic.go:334] "Generic (PLEG): container finished" podID="94b53cf7-013c-4df3-82de-c438b35806ac" containerID="9bf0a4efe2e24ad85348bb01c4e3fe64356fe21da9707c1dbcbb87c496f8dea8" exitCode=0 Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.466066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" event={"ID":"94b53cf7-013c-4df3-82de-c438b35806ac","Type":"ContainerDied","Data":"9bf0a4efe2e24ad85348bb01c4e3fe64356fe21da9707c1dbcbb87c496f8dea8"} Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.722970 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.725044 
4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.727762 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dmlxs" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.732632 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.733977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.736109 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-s9r7c" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.749360 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.759928 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.766757 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.768117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9sbt\" (UniqueName: \"kubernetes.io/projected/0e05eb35-0ec5-4760-8c17-fa88a565b38c-kube-api-access-v9sbt\") pod \"cinder-operator-controller-manager-859b6ccc6-kkcv7\" (UID: \"0e05eb35-0ec5-4760-8c17-fa88a565b38c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.768219 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.768299 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjhc\" (UniqueName: \"kubernetes.io/projected/5f12ac11-7904-4ca1-beff-3bc654b8bf24-kube-api-access-wfjhc\") pod \"barbican-operator-controller-manager-7d9dfd778-vkd9c\" (UID: \"5f12ac11-7904-4ca1-beff-3bc654b8bf24\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.770976 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gw8px" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.775747 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.804061 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.809829 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p6lfm" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.815048 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.868667 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.869674 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjhc\" (UniqueName: \"kubernetes.io/projected/5f12ac11-7904-4ca1-beff-3bc654b8bf24-kube-api-access-wfjhc\") pod \"barbican-operator-controller-manager-7d9dfd778-vkd9c\" (UID: \"5f12ac11-7904-4ca1-beff-3bc654b8bf24\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.869765 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9sbt\" (UniqueName: \"kubernetes.io/projected/0e05eb35-0ec5-4760-8c17-fa88a565b38c-kube-api-access-v9sbt\") pod \"cinder-operator-controller-manager-859b6ccc6-kkcv7\" (UID: \"0e05eb35-0ec5-4760-8c17-fa88a565b38c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.889112 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.890550 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.908245 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-z8655" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.908467 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.910018 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.916476 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4bp2h" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.920702 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjhc\" (UniqueName: \"kubernetes.io/projected/5f12ac11-7904-4ca1-beff-3bc654b8bf24-kube-api-access-wfjhc\") pod \"barbican-operator-controller-manager-7d9dfd778-vkd9c\" (UID: \"5f12ac11-7904-4ca1-beff-3bc654b8bf24\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.921113 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.937298 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl"] Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.952818 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9sbt\" (UniqueName: \"kubernetes.io/projected/0e05eb35-0ec5-4760-8c17-fa88a565b38c-kube-api-access-v9sbt\") pod \"cinder-operator-controller-manager-859b6ccc6-kkcv7\" (UID: \"0e05eb35-0ec5-4760-8c17-fa88a565b38c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.984860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvmx\" (UniqueName: \"kubernetes.io/projected/49a30376-5e1f-4065-a8cd-b728b7413a07-kube-api-access-2mvmx\") pod \"designate-operator-controller-manager-78b4bc895b-l8h49\" (UID: \"49a30376-5e1f-4065-a8cd-b728b7413a07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.984924 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545dx\" (UniqueName: \"kubernetes.io/projected/5c0b16d3-8d9a-44c8-882b-8a90fd89379d-kube-api-access-545dx\") pod \"glance-operator-controller-manager-77987cd8cd-hp8pr\" (UID: \"5c0b16d3-8d9a-44c8-882b-8a90fd89379d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.984994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnqc7\" (UniqueName: \"kubernetes.io/projected/62e28422-a646-45cc-9a4b-f3de5e2fc463-kube-api-access-qnqc7\") pod \"horizon-operator-controller-manager-68c6d99b8f-jsxl2\" (UID: \"62e28422-a646-45cc-9a4b-f3de5e2fc463\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:15:02 crc kubenswrapper[4954]: I1206 07:15:02.985052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zlss\" (UniqueName: \"kubernetes.io/projected/a47971dc-993b-47f7-b65b-4348dfd56866-kube-api-access-9zlss\") pod \"heat-operator-controller-manager-5f64f6f8bb-pqzdl\" (UID: \"a47971dc-993b-47f7-b65b-4348dfd56866\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:15:03 crc 
kubenswrapper[4954]: I1206 07:15:03.003725 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.005552 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.012648 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cgxql" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.013418 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.018880 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.022160 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.030416 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-24nl8" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.055665 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.079337 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.080698 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.083207 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fg9s6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.083919 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.087296 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvmx\" (UniqueName: \"kubernetes.io/projected/49a30376-5e1f-4065-a8cd-b728b7413a07-kube-api-access-2mvmx\") pod \"designate-operator-controller-manager-78b4bc895b-l8h49\" (UID: \"49a30376-5e1f-4065-a8cd-b728b7413a07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.087358 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545dx\" (UniqueName: \"kubernetes.io/projected/5c0b16d3-8d9a-44c8-882b-8a90fd89379d-kube-api-access-545dx\") pod \"glance-operator-controller-manager-77987cd8cd-hp8pr\" (UID: \"5c0b16d3-8d9a-44c8-882b-8a90fd89379d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.087422 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnqc7\" (UniqueName: \"kubernetes.io/projected/62e28422-a646-45cc-9a4b-f3de5e2fc463-kube-api-access-qnqc7\") pod \"horizon-operator-controller-manager-68c6d99b8f-jsxl2\" (UID: \"62e28422-a646-45cc-9a4b-f3de5e2fc463\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.087473 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zlss\" (UniqueName: \"kubernetes.io/projected/a47971dc-993b-47f7-b65b-4348dfd56866-kube-api-access-9zlss\") pod \"heat-operator-controller-manager-5f64f6f8bb-pqzdl\" (UID: \"a47971dc-993b-47f7-b65b-4348dfd56866\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.102646 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.126734 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-545dx\" (UniqueName: \"kubernetes.io/projected/5c0b16d3-8d9a-44c8-882b-8a90fd89379d-kube-api-access-545dx\") pod \"glance-operator-controller-manager-77987cd8cd-hp8pr\" (UID: \"5c0b16d3-8d9a-44c8-882b-8a90fd89379d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.138549 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvmx\" (UniqueName: \"kubernetes.io/projected/49a30376-5e1f-4065-a8cd-b728b7413a07-kube-api-access-2mvmx\") pod \"designate-operator-controller-manager-78b4bc895b-l8h49\" (UID: \"49a30376-5e1f-4065-a8cd-b728b7413a07\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.146415 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zlss\" (UniqueName: \"kubernetes.io/projected/a47971dc-993b-47f7-b65b-4348dfd56866-kube-api-access-9zlss\") pod \"heat-operator-controller-manager-5f64f6f8bb-pqzdl\" (UID: \"a47971dc-993b-47f7-b65b-4348dfd56866\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 
07:15:03.163861 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.165195 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.167413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnqc7\" (UniqueName: \"kubernetes.io/projected/62e28422-a646-45cc-9a4b-f3de5e2fc463-kube-api-access-qnqc7\") pod \"horizon-operator-controller-manager-68c6d99b8f-jsxl2\" (UID: \"62e28422-a646-45cc-9a4b-f3de5e2fc463\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.191897 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.191984 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vj9q\" (UniqueName: \"kubernetes.io/projected/b6dcd22b-76ae-440f-8f04-2d2ea96a07f5-kube-api-access-8vj9q\") pod \"keystone-operator-controller-manager-7765d96ddf-t67pd\" (UID: \"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.192022 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5mw6\" (UniqueName: \"kubernetes.io/projected/41e31a3c-2482-492f-a8d6-220d9aaeefe3-kube-api-access-g5mw6\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.192055 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kzt\" (UniqueName: \"kubernetes.io/projected/17e0dd71-107b-454c-97f6-134644f51144-kube-api-access-59kzt\") pod \"ironic-operator-controller-manager-6c548fd776-4kdrf\" (UID: \"17e0dd71-107b-454c-97f6-134644f51144\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.204919 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.206746 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.214261 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bvp4r" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.224288 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.255618 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.270226 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.271710 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.283141 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pkc52" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.289602 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.294949 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.296886 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.299431 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.299499 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vj9q\" (UniqueName: \"kubernetes.io/projected/b6dcd22b-76ae-440f-8f04-2d2ea96a07f5-kube-api-access-8vj9q\") pod \"keystone-operator-controller-manager-7765d96ddf-t67pd\" (UID: \"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.299577 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5mw6\" (UniqueName: \"kubernetes.io/projected/41e31a3c-2482-492f-a8d6-220d9aaeefe3-kube-api-access-g5mw6\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.299620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kzt\" (UniqueName: \"kubernetes.io/projected/17e0dd71-107b-454c-97f6-134644f51144-kube-api-access-59kzt\") pod \"ironic-operator-controller-manager-6c548fd776-4kdrf\" (UID: \"17e0dd71-107b-454c-97f6-134644f51144\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:15:03 crc kubenswrapper[4954]: E1206 07:15:03.305617 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.305658 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tddmv" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.311053 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.313796 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:15:03 crc kubenswrapper[4954]: E1206 07:15:03.324689 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert podName:41e31a3c-2482-492f-a8d6-220d9aaeefe3 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:03.824631486 +0000 UTC m=+1078.637990875 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert") pod "infra-operator-controller-manager-57548d458d-fn6mj" (UID: "41e31a3c-2482-492f-a8d6-220d9aaeefe3") : secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.324751 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.349058 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.356362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vj9q\" (UniqueName: \"kubernetes.io/projected/b6dcd22b-76ae-440f-8f04-2d2ea96a07f5-kube-api-access-8vj9q\") pod \"keystone-operator-controller-manager-7765d96ddf-t67pd\" (UID: \"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.372238 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5mw6\" (UniqueName: \"kubernetes.io/projected/41e31a3c-2482-492f-a8d6-220d9aaeefe3-kube-api-access-g5mw6\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.376058 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.377167 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kzt\" (UniqueName: \"kubernetes.io/projected/17e0dd71-107b-454c-97f6-134644f51144-kube-api-access-59kzt\") pod \"ironic-operator-controller-manager-6c548fd776-4kdrf\" (UID: \"17e0dd71-107b-454c-97f6-134644f51144\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.379298 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.380887 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.381684 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.383459 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rl2vj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.404448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz8zg\" (UniqueName: \"kubernetes.io/projected/f0e990d9-0ccf-44f0-9811-265e79f933c3-kube-api-access-bz8zg\") pod \"manila-operator-controller-manager-7c79b5df47-m9bfd\" (UID: \"f0e990d9-0ccf-44f0-9811-265e79f933c3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.404628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwc4x\" (UniqueName: \"kubernetes.io/projected/8b7cf0fe-276e-41bf-88a8-76f5f56e884e-kube-api-access-vwc4x\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dltbn\" (UID: \"8b7cf0fe-276e-41bf-88a8-76f5f56e884e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.404669 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mhv\" (UniqueName: \"kubernetes.io/projected/02417e24-828d-4cb4-9a33-3908caad8c9c-kube-api-access-59mhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-vmctk\" (UID: \"02417e24-828d-4cb4-9a33-3908caad8c9c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.408654 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6cpm4" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.415783 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.425413 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.452726 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.465849 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.537788 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gm9pb" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.603803 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.660444 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7bj6\" (UniqueName: \"kubernetes.io/projected/2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b-kube-api-access-p7bj6\") pod \"nova-operator-controller-manager-697bc559fc-8dxm6\" (UID: \"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.660546 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwc4x\" (UniqueName: \"kubernetes.io/projected/8b7cf0fe-276e-41bf-88a8-76f5f56e884e-kube-api-access-vwc4x\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dltbn\" (UID: \"8b7cf0fe-276e-41bf-88a8-76f5f56e884e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.660618 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqnk\" (UniqueName: \"kubernetes.io/projected/5dac40b5-0993-4696-995b-2476436126fc-kube-api-access-jqqnk\") pod \"octavia-operator-controller-manager-998648c74-g2rfv\" (UID: \"5dac40b5-0993-4696-995b-2476436126fc\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.660675 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mhv\" (UniqueName: \"kubernetes.io/projected/02417e24-828d-4cb4-9a33-3908caad8c9c-kube-api-access-59mhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-vmctk\" (UID: \"02417e24-828d-4cb4-9a33-3908caad8c9c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.660771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz8zg\" (UniqueName: \"kubernetes.io/projected/f0e990d9-0ccf-44f0-9811-265e79f933c3-kube-api-access-bz8zg\") pod \"manila-operator-controller-manager-7c79b5df47-m9bfd\" (UID: \"f0e990d9-0ccf-44f0-9811-265e79f933c3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.672260 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.699106 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.745693 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.763238 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.763721 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-flvbp" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.764113 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mhv\" (UniqueName: \"kubernetes.io/projected/02417e24-828d-4cb4-9a33-3908caad8c9c-kube-api-access-59mhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-vmctk\" (UID: \"02417e24-828d-4cb4-9a33-3908caad8c9c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.783144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5zz\" (UniqueName: \"kubernetes.io/projected/4aa03482-7c89-4997-b9a4-096ba0f2d47a-kube-api-access-8s5zz\") pod \"placement-operator-controller-manager-78f8948974-w7lxk\" (UID: \"4aa03482-7c89-4997-b9a4-096ba0f2d47a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.783374 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7bj6\" (UniqueName: \"kubernetes.io/projected/2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b-kube-api-access-p7bj6\") pod \"nova-operator-controller-manager-697bc559fc-8dxm6\" (UID: \"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.783454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqnk\" (UniqueName: \"kubernetes.io/projected/5dac40b5-0993-4696-995b-2476436126fc-kube-api-access-jqqnk\") pod \"octavia-operator-controller-manager-998648c74-g2rfv\" (UID: \"5dac40b5-0993-4696-995b-2476436126fc\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.815095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwc4x\" (UniqueName: \"kubernetes.io/projected/8b7cf0fe-276e-41bf-88a8-76f5f56e884e-kube-api-access-vwc4x\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dltbn\" (UID: \"8b7cf0fe-276e-41bf-88a8-76f5f56e884e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.827122 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz8zg\" (UniqueName: \"kubernetes.io/projected/f0e990d9-0ccf-44f0-9811-265e79f933c3-kube-api-access-bz8zg\") pod \"manila-operator-controller-manager-7c79b5df47-m9bfd\" (UID: \"f0e990d9-0ccf-44f0-9811-265e79f933c3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.877443 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7bj6\" (UniqueName: \"kubernetes.io/projected/2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b-kube-api-access-p7bj6\") 
pod \"nova-operator-controller-manager-697bc559fc-8dxm6\" (UID: \"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.890503 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.898210 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.898304 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5zz\" (UniqueName: \"kubernetes.io/projected/4aa03482-7c89-4997-b9a4-096ba0f2d47a-kube-api-access-8s5zz\") pod \"placement-operator-controller-manager-78f8948974-w7lxk\" (UID: \"4aa03482-7c89-4997-b9a4-096ba0f2d47a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.898340 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.898440 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlt5\" (UniqueName: \"kubernetes.io/projected/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-kube-api-access-zmlt5\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:03 crc kubenswrapper[4954]: E1206 07:15:03.900339 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:03 crc kubenswrapper[4954]: E1206 07:15:03.903717 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert podName:41e31a3c-2482-492f-a8d6-220d9aaeefe3 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:04.903693983 +0000 UTC m=+1079.717053372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert") pod "infra-operator-controller-manager-57548d458d-fn6mj" (UID: "41e31a3c-2482-492f-a8d6-220d9aaeefe3") : secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.929763 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.933665 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.934636 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.938419 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr"] Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.939451 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.946893 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gxs58" Dec 06 07:15:03 crc kubenswrapper[4954]: I1206 07:15:03.954271 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:03.985930 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:03.987284 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:03.992819 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:03.998648 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.006753 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.013323 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-psg7k" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.013539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqnk\" (UniqueName: \"kubernetes.io/projected/5dac40b5-0993-4696-995b-2476436126fc-kube-api-access-jqqnk\") pod \"octavia-operator-controller-manager-998648c74-g2rfv\" (UID: \"5dac40b5-0993-4696-995b-2476436126fc\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.021497 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.025503 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.029094 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nkk82" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.032265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.032338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlt5\" (UniqueName: \"kubernetes.io/projected/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-kube-api-access-zmlt5\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.032857 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.032907 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:04.532892816 +0000 UTC m=+1079.346252205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.054738 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.055315 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.063545 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.065191 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.069879 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.076292 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-f4fqf" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.080288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5zz\" (UniqueName: \"kubernetes.io/projected/4aa03482-7c89-4997-b9a4-096ba0f2d47a-kube-api-access-8s5zz\") pod \"placement-operator-controller-manager-78f8948974-w7lxk\" (UID: \"4aa03482-7c89-4997-b9a4-096ba0f2d47a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.082673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlt5\" (UniqueName: \"kubernetes.io/projected/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-kube-api-access-zmlt5\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.101836 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.103367 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.108972 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fstzt" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.118351 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.138613 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dd84\" (UniqueName: \"kubernetes.io/projected/56b42964-963e-469d-bc21-867a3c560e66-kube-api-access-4dd84\") pod \"swift-operator-controller-manager-5f8c65bbfc-frjm7\" (UID: \"56b42964-963e-469d-bc21-867a3c560e66\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.138683 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq5k9\" (UniqueName: \"kubernetes.io/projected/ae04bc7e-d1cf-4063-8236-d3717d1b8f51-kube-api-access-wq5k9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8x2\" (UID: \"ae04bc7e-d1cf-4063-8236-d3717d1b8f51\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.138710 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7fs\" (UniqueName: \"kubernetes.io/projected/c6aa3674-f8f4-4a0e-bc00-2629d49818e7-kube-api-access-4v7fs\") pod \"ovn-operator-controller-manager-b6456fdb6-khxwr\" (UID: \"c6aa3674-f8f4-4a0e-bc00-2629d49818e7\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.141881 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.144388 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.150748 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.151423 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6t9px" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.151754 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.151924 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.181653 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.183085 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.201255 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sbvzx" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.239835 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvd6z\" (UniqueName: \"kubernetes.io/projected/fb5012c2-0242-4c82-bedb-58f94bc2c8d1-kube-api-access-kvd6z\") pod \"watcher-operator-controller-manager-769dc69bc-zh2kh\" (UID: \"fb5012c2-0242-4c82-bedb-58f94bc2c8d1\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.239965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dd84\" (UniqueName: \"kubernetes.io/projected/56b42964-963e-469d-bc21-867a3c560e66-kube-api-access-4dd84\") pod \"swift-operator-controller-manager-5f8c65bbfc-frjm7\" (UID: \"56b42964-963e-469d-bc21-867a3c560e66\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.239993 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq5k9\" (UniqueName: \"kubernetes.io/projected/ae04bc7e-d1cf-4063-8236-d3717d1b8f51-kube-api-access-wq5k9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8x2\" (UID: \"ae04bc7e-d1cf-4063-8236-d3717d1b8f51\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.240015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7fs\" (UniqueName: \"kubernetes.io/projected/c6aa3674-f8f4-4a0e-bc00-2629d49818e7-kube-api-access-4v7fs\") pod \"ovn-operator-controller-manager-b6456fdb6-khxwr\" (UID: \"c6aa3674-f8f4-4a0e-bc00-2629d49818e7\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.240068 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2sv\" (UniqueName: \"kubernetes.io/projected/d05a311c-788e-41fa-9ddc-52c98fe6dc7c-kube-api-access-rg2sv\") pod \"test-operator-controller-manager-5854674fcc-fpbr4\" (UID: \"d05a311c-788e-41fa-9ddc-52c98fe6dc7c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.258575 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.263963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq5k9\" (UniqueName: \"kubernetes.io/projected/ae04bc7e-d1cf-4063-8236-d3717d1b8f51-kube-api-access-wq5k9\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8x2\" (UID: \"ae04bc7e-d1cf-4063-8236-d3717d1b8f51\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.270765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dd84\" (UniqueName: 
\"kubernetes.io/projected/56b42964-963e-469d-bc21-867a3c560e66-kube-api-access-4dd84\") pod \"swift-operator-controller-manager-5f8c65bbfc-frjm7\" (UID: \"56b42964-963e-469d-bc21-867a3c560e66\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.281943 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.283794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7fs\" (UniqueName: \"kubernetes.io/projected/c6aa3674-f8f4-4a0e-bc00-2629d49818e7-kube-api-access-4v7fs\") pod \"ovn-operator-controller-manager-b6456fdb6-khxwr\" (UID: \"c6aa3674-f8f4-4a0e-bc00-2629d49818e7\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.296199 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.324036 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.342856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvd6z\" (UniqueName: \"kubernetes.io/projected/fb5012c2-0242-4c82-bedb-58f94bc2c8d1-kube-api-access-kvd6z\") pod \"watcher-operator-controller-manager-769dc69bc-zh2kh\" (UID: \"fb5012c2-0242-4c82-bedb-58f94bc2c8d1\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.342958 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnr4\" (UniqueName: \"kubernetes.io/projected/a233e22e-ec80-4af3-b6db-488834c983de-kube-api-access-wpnr4\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.343050 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.343099 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.343134 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdsp\" (UniqueName: \"kubernetes.io/projected/f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008-kube-api-access-zvdsp\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-gt8z5\" (UID: \"f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.345142 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2sv\" (UniqueName: \"kubernetes.io/projected/d05a311c-788e-41fa-9ddc-52c98fe6dc7c-kube-api-access-rg2sv\") pod \"test-operator-controller-manager-5854674fcc-fpbr4\" (UID: \"d05a311c-788e-41fa-9ddc-52c98fe6dc7c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.406057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvd6z\" (UniqueName: \"kubernetes.io/projected/fb5012c2-0242-4c82-bedb-58f94bc2c8d1-kube-api-access-kvd6z\") pod \"watcher-operator-controller-manager-769dc69bc-zh2kh\" (UID: \"fb5012c2-0242-4c82-bedb-58f94bc2c8d1\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.406057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2sv\" (UniqueName: \"kubernetes.io/projected/d05a311c-788e-41fa-9ddc-52c98fe6dc7c-kube-api-access-rg2sv\") pod \"test-operator-controller-manager-5854674fcc-fpbr4\" (UID: \"d05a311c-788e-41fa-9ddc-52c98fe6dc7c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.432905 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.446298 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.446866 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnr4\" (UniqueName: \"kubernetes.io/projected/a233e22e-ec80-4af3-b6db-488834c983de-kube-api-access-wpnr4\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.447002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.447041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.447096 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdsp\" (UniqueName: \"kubernetes.io/projected/f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008-kube-api-access-zvdsp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gt8z5\" (UID: \"f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.449139 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.449207 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:04.949181032 +0000 UTC m=+1079.762540421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.449492 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.449516 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:04.949508811 +0000 UTC m=+1079.762868200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.460621 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.471948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpnr4\" (UniqueName: \"kubernetes.io/projected/a233e22e-ec80-4af3-b6db-488834c983de-kube-api-access-wpnr4\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.472943 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdsp\" (UniqueName: \"kubernetes.io/projected/f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008-kube-api-access-zvdsp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gt8z5\" (UID: \"f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.509890 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.545705 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.552001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.557581 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: E1206 07:15:04.557725 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:05.55769098 +0000 UTC m=+1080.371050369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.735987 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" event={"ID":"5f12ac11-7904-4ca1-beff-3bc654b8bf24","Type":"ContainerStarted","Data":"b7257512c21cd9d83c311946e0bab57fd1a972aa13d8710d811085466788f0de"} Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.860743 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl"] Dec 06 07:15:04 crc kubenswrapper[4954]: I1206 07:15:04.896995 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr"] Dec 06 07:15:04 crc kubenswrapper[4954]: W1206 07:15:04.931211 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0b16d3_8d9a_44c8_882b_8a90fd89379d.slice/crio-ea13fea89f578ef969aca9eac77671e99441f583bee8be9af94b610d46894bfa WatchSource:0}: Error finding container ea13fea89f578ef969aca9eac77671e99441f583bee8be9af94b610d46894bfa: Status 404 returned error can't find the container with id ea13fea89f578ef969aca9eac77671e99441f583bee8be9af94b610d46894bfa Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.030403 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.030481 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.030517 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030700 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030765 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:06.030743948 +0000 UTC m=+1080.844103337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030814 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030835 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert podName:41e31a3c-2482-492f-a8d6-220d9aaeefe3 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:07.03082826 +0000 UTC m=+1081.844187649 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert") pod "infra-operator-controller-manager-57548d458d-fn6mj" (UID: "41e31a3c-2482-492f-a8d6-220d9aaeefe3") : secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030869 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.030888 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:06.030881262 +0000 UTC m=+1080.844240651 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.165045 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.335906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume\") pod \"94b53cf7-013c-4df3-82de-c438b35806ac\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.336421 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t\") pod \"94b53cf7-013c-4df3-82de-c438b35806ac\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.336623 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume\") pod \"94b53cf7-013c-4df3-82de-c438b35806ac\" (UID: \"94b53cf7-013c-4df3-82de-c438b35806ac\") " Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.337606 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "94b53cf7-013c-4df3-82de-c438b35806ac" (UID: "94b53cf7-013c-4df3-82de-c438b35806ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.345468 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94b53cf7-013c-4df3-82de-c438b35806ac" (UID: "94b53cf7-013c-4df3-82de-c438b35806ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.345468 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t" (OuterVolumeSpecName: "kube-api-access-4nf2t") pod "94b53cf7-013c-4df3-82de-c438b35806ac" (UID: "94b53cf7-013c-4df3-82de-c438b35806ac"). InnerVolumeSpecName "kube-api-access-4nf2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.438522 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/94b53cf7-013c-4df3-82de-c438b35806ac-kube-api-access-4nf2t\") on node \"crc\" DevicePath \"\"" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.438580 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b53cf7-013c-4df3-82de-c438b35806ac-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.438591 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94b53cf7-013c-4df3-82de-c438b35806ac-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.568795 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7"] Dec 06 07:15:05 crc kubenswrapper[4954]: W1206 07:15:05.618042 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e0dd71_107b_454c_97f6_134644f51144.slice/crio-6de2bf220ffdfdddcb91f4b7a1d0f54c42e6508fe9d7d4ef82ed3c1f33191ac2 WatchSource:0}: Error finding container 6de2bf220ffdfdddcb91f4b7a1d0f54c42e6508fe9d7d4ef82ed3c1f33191ac2: Status 404 returned error can't find the container with id 6de2bf220ffdfdddcb91f4b7a1d0f54c42e6508fe9d7d4ef82ed3c1f33191ac2 Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.620756 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.647903 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.648151 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.648212 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:07.648190785 +0000 UTC m=+1082.461550174 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.652031 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.659250 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.683489 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.691545 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.698898 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd"] Dec 06 07:15:05 crc kubenswrapper[4954]: W1206 07:15:05.722693 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dac40b5_0993_4696_995b_2476436126fc.slice/crio-f29319ca78dbb394355121a38007a783129e9625451b7f505b2f2d41d18ef898 WatchSource:0}: Error finding container f29319ca78dbb394355121a38007a783129e9625451b7f505b2f2d41d18ef898: Status 404 returned error can't find the container with id f29319ca78dbb394355121a38007a783129e9625451b7f505b2f2d41d18ef898 Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.727352 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.754813 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" event={"ID":"62e28422-a646-45cc-9a4b-f3de5e2fc463","Type":"ContainerStarted","Data":"2f7cc952923f2fd1ec588c927b66cc0f416349dd427c71086b367231b620f08c"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.758494 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" event={"ID":"94b53cf7-013c-4df3-82de-c438b35806ac","Type":"ContainerDied","Data":"8a9bfa690967a97700bc966f7d05a7dd038db940918d6e971893c923ec8c0d60"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.758617 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a9bfa690967a97700bc966f7d05a7dd038db940918d6e971893c923ec8c0d60" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.759049 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.765892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" event={"ID":"17e0dd71-107b-454c-97f6-134644f51144","Type":"ContainerStarted","Data":"6de2bf220ffdfdddcb91f4b7a1d0f54c42e6508fe9d7d4ef82ed3c1f33191ac2"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.768833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" event={"ID":"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5","Type":"ContainerStarted","Data":"fe2ef04909dd4cd3d7511e6e8fba65d42d083f92224c800c37c4708b1a05a14c"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.774846 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" event={"ID":"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b","Type":"ContainerStarted","Data":"4f01814bef315c30f2c8367558abfb2ef89a1f844530dec04f9c8468bedbcc3c"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.776779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" event={"ID":"a47971dc-993b-47f7-b65b-4348dfd56866","Type":"ContainerStarted","Data":"654a7a0308890cea1acc4bcc7cb62905baa5f27dc3f8e335e65ba08d872aa298"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.792006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" event={"ID":"0e05eb35-0ec5-4760-8c17-fa88a565b38c","Type":"ContainerStarted","Data":"3c40a8a6c0cc0715f227c887cc6f8e11d1c767dd73f31bd55e89a87dd4e8a367"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.801036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" event={"ID":"5c0b16d3-8d9a-44c8-882b-8a90fd89379d","Type":"ContainerStarted","Data":"ea13fea89f578ef969aca9eac77671e99441f583bee8be9af94b610d46894bfa"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.803726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" event={"ID":"49a30376-5e1f-4065-a8cd-b728b7413a07","Type":"ContainerStarted","Data":"7b668be8ac8b63aedf9515f0a14e0a02f95f170b3461ae8fafedadc0c71975bd"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.815738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" event={"ID":"5dac40b5-0993-4696-995b-2476436126fc","Type":"ContainerStarted","Data":"f29319ca78dbb394355121a38007a783129e9625451b7f505b2f2d41d18ef898"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.839791 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.839851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" event={"ID":"f0e990d9-0ccf-44f0-9811-265e79f933c3","Type":"ContainerStarted","Data":"91a6bf8574b044837dec2562f41bd232001a41d0a36d5347a2a4f63e0d45a9fc"} Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.859743 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.868135 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk"] Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.878149 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7"] Dec 06 07:15:05 crc kubenswrapper[4954]: W1206 07:15:05.893580 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b42964_963e_469d_bc21_867a3c560e66.slice/crio-254d5f00ef68d1825140d7747634cdc8b70392a35553e43a513b849b8901c0ce WatchSource:0}: Error finding container 254d5f00ef68d1825140d7747634cdc8b70392a35553e43a513b849b8901c0ce: Status 404 returned error can't find the container with id 254d5f00ef68d1825140d7747634cdc8b70392a35553e43a513b849b8901c0ce Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.893877 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59mhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-vmctk_openstack-operators(02417e24-828d-4cb4-9a33-3908caad8c9c): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.896698 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59mhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-vmctk_openstack-operators(02417e24-828d-4cb4-9a33-3908caad8c9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.898628 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" podUID="02417e24-828d-4cb4-9a33-3908caad8c9c" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.900399 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dd84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-frjm7_openstack-operators(56b42964-963e-469d-bc21-867a3c560e66): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.906610 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk"] Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.913683 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dd84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-frjm7_openstack-operators(56b42964-963e-469d-bc21-867a3c560e66): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.916362 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" podUID="56b42964-963e-469d-bc21-867a3c560e66" Dec 06 07:15:05 crc kubenswrapper[4954]: W1206 07:15:05.921515 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae04bc7e_d1cf_4063_8236_d3717d1b8f51.slice/crio-3e33d144cd1afd0b63a50db6cce6a58e10c7413f2926390dc73bb52ca88cc01d WatchSource:0}: Error finding container 3e33d144cd1afd0b63a50db6cce6a58e10c7413f2926390dc73bb52ca88cc01d: Status 404 returned error can't find the container with id 3e33d144cd1afd0b63a50db6cce6a58e10c7413f2926390dc73bb52ca88cc01d Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.927028 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wq5k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2x8x2_openstack-operators(ae04bc7e-d1cf-4063-8236-d3717d1b8f51): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.934968 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wq5k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2x8x2_openstack-operators(ae04bc7e-d1cf-4063-8236-d3717d1b8f51): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:05 crc kubenswrapper[4954]: E1206 07:15:05.936660 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" podUID="ae04bc7e-d1cf-4063-8236-d3717d1b8f51" Dec 06 07:15:05 crc kubenswrapper[4954]: I1206 07:15:05.941665 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2"] Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.040756 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh"] Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.054407 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr"] Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.056548 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v7fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-khxwr_openstack-operators(c6aa3674-f8f4-4a0e-bc00-2629d49818e7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.057351 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.057416 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.057627 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.057660 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.057756 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:08.05772384 +0000 UTC m=+1082.871083389 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.057867 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:08.057784912 +0000 UTC m=+1082.871144451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.059062 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v7fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-khxwr_openstack-operators(c6aa3674-f8f4-4a0e-bc00-2629d49818e7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.060831 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" podUID="c6aa3674-f8f4-4a0e-bc00-2629d49818e7" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.079336 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5"] Dec 06 07:15:06 crc kubenswrapper[4954]: W1206 07:15:06.081760 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5cd0bdb_bcec_4ba6_8272_5fe99b6e4008.slice/crio-49d70e1f3c41f197b3dcd3c82ae099e18b369c9341f5f2d412cd06a20e4c16f1 WatchSource:0}: Error finding container 49d70e1f3c41f197b3dcd3c82ae099e18b369c9341f5f2d412cd06a20e4c16f1: Status 404 returned error can't find the container with id 49d70e1f3c41f197b3dcd3c82ae099e18b369c9341f5f2d412cd06a20e4c16f1 Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.087701 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvdsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gt8z5_openstack-operators(f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.089638 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podUID="f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.867295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" event={"ID":"56b42964-963e-469d-bc21-867a3c560e66","Type":"ContainerStarted","Data":"254d5f00ef68d1825140d7747634cdc8b70392a35553e43a513b849b8901c0ce"} Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.872142 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" podUID="56b42964-963e-469d-bc21-867a3c560e66" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.877005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" event={"ID":"4aa03482-7c89-4997-b9a4-096ba0f2d47a","Type":"ContainerStarted","Data":"4dfe077bddbda6bab43692e6345b66ccd9ca61c0677f645f5cf4eb4862a73b7d"} Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.879498 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" event={"ID":"fb5012c2-0242-4c82-bedb-58f94bc2c8d1","Type":"ContainerStarted","Data":"6c2e2b9459d7cf3bcc9468080048cb1426bea54e40cf5d8ea7458f6293a8bb3e"} Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.892542 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" event={"ID":"02417e24-828d-4cb4-9a33-3908caad8c9c","Type":"ContainerStarted","Data":"08e94818ad62ee54f46065c7131bb69c08f56d5825c0e03b43ccf421b6d939fc"} Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.897137 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" event={"ID":"8b7cf0fe-276e-41bf-88a8-76f5f56e884e","Type":"ContainerStarted","Data":"bdd28f31f6d6554f13ceb95bd9b49e1e92235408dba08a3581db5f1f35a8fdb8"} Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.899077 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" event={"ID":"c6aa3674-f8f4-4a0e-bc00-2629d49818e7","Type":"ContainerStarted","Data":"52e27e3e566f6db46a771dd3323d5df3ac2a9491166e3f2833947df93ad263e9"} Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.899865 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" podUID="02417e24-828d-4cb4-9a33-3908caad8c9c" Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.903974 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" podUID="c6aa3674-f8f4-4a0e-bc00-2629d49818e7" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.905278 4954 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" event={"ID":"f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008","Type":"ContainerStarted","Data":"49d70e1f3c41f197b3dcd3c82ae099e18b369c9341f5f2d412cd06a20e4c16f1"} Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.909351 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podUID="f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008" Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.910154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" event={"ID":"d05a311c-788e-41fa-9ddc-52c98fe6dc7c","Type":"ContainerStarted","Data":"cc3bd3602ed4a713642081ddc4a3029c47127f57577199795f6ebf6142e61e6d"} Dec 06 07:15:06 crc kubenswrapper[4954]: I1206 07:15:06.938339 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" event={"ID":"ae04bc7e-d1cf-4063-8236-d3717d1b8f51","Type":"ContainerStarted","Data":"3e33d144cd1afd0b63a50db6cce6a58e10c7413f2926390dc73bb52ca88cc01d"} Dec 06 07:15:06 crc kubenswrapper[4954]: E1206 07:15:06.946935 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" podUID="ae04bc7e-d1cf-4063-8236-d3717d1b8f51" Dec 06 07:15:07 crc kubenswrapper[4954]: I1206 07:15:07.088129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.088399 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.088529 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert podName:41e31a3c-2482-492f-a8d6-220d9aaeefe3 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:11.088494104 +0000 UTC m=+1085.901853663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert") pod "infra-operator-controller-manager-57548d458d-fn6mj" (UID: "41e31a3c-2482-492f-a8d6-220d9aaeefe3") : secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:07 crc kubenswrapper[4954]: I1206 07:15:07.699755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.700029 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.700758 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:11.700618069 +0000 UTC m=+1086.513977458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.951574 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" podUID="c6aa3674-f8f4-4a0e-bc00-2629d49818e7" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.958192 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podUID="f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.961947 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" podUID="ae04bc7e-d1cf-4063-8236-d3717d1b8f51" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.962082 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" podUID="02417e24-828d-4cb4-9a33-3908caad8c9c" Dec 06 07:15:07 crc kubenswrapper[4954]: E1206 07:15:07.962211 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" podUID="56b42964-963e-469d-bc21-867a3c560e66" Dec 06 07:15:08 crc kubenswrapper[4954]: I1206 07:15:08.110852 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:08 crc kubenswrapper[4954]: I1206 07:15:08.110924 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:08 crc kubenswrapper[4954]: E1206 07:15:08.111095 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:08 crc kubenswrapper[4954]: E1206 07:15:08.111136 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:08 crc kubenswrapper[4954]: E1206 07:15:08.111217 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:12.111190342 +0000 UTC m=+1086.924549731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:08 crc kubenswrapper[4954]: E1206 07:15:08.111249 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:12.111239323 +0000 UTC m=+1086.924598712 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:10 crc kubenswrapper[4954]: I1206 07:15:10.102244 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:15:10 crc kubenswrapper[4954]: I1206 07:15:10.103245 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:15:11 crc kubenswrapper[4954]: I1206 07:15:11.122336 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:11 crc kubenswrapper[4954]: E1206 07:15:11.122632 4954 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:11 crc kubenswrapper[4954]: E1206 07:15:11.123899 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert podName:41e31a3c-2482-492f-a8d6-220d9aaeefe3 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:19.123868839 +0000 UTC m=+1093.937228228 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert") pod "infra-operator-controller-manager-57548d458d-fn6mj" (UID: "41e31a3c-2482-492f-a8d6-220d9aaeefe3") : secret "infra-operator-webhook-server-cert" not found Dec 06 07:15:11 crc kubenswrapper[4954]: I1206 07:15:11.735977 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:11 crc kubenswrapper[4954]: E1206 07:15:11.736261 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:11 crc kubenswrapper[4954]: E1206 07:15:11.736371 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:19.736344133 +0000 UTC m=+1094.549703522 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:12 crc kubenswrapper[4954]: I1206 07:15:12.143457 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:12 crc kubenswrapper[4954]: I1206 07:15:12.143708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:12 crc kubenswrapper[4954]: E1206 07:15:12.143926 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:12 crc kubenswrapper[4954]: E1206 07:15:12.144025 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:20.143995918 +0000 UTC m=+1094.957355307 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:12 crc kubenswrapper[4954]: E1206 07:15:12.143926 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:12 crc kubenswrapper[4954]: E1206 07:15:12.144082 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:20.14406959 +0000 UTC m=+1094.957428979 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:19 crc kubenswrapper[4954]: I1206 07:15:19.210765 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:19 crc kubenswrapper[4954]: I1206 07:15:19.219776 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41e31a3c-2482-492f-a8d6-220d9aaeefe3-cert\") pod \"infra-operator-controller-manager-57548d458d-fn6mj\" (UID: \"41e31a3c-2482-492f-a8d6-220d9aaeefe3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:19 crc kubenswrapper[4954]: I1206 07:15:19.250512 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cgxql" Dec 06 07:15:19 crc kubenswrapper[4954]: I1206 07:15:19.258223 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:15:19 crc kubenswrapper[4954]: I1206 07:15:19.823153 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:19 crc kubenswrapper[4954]: E1206 07:15:19.823480 4954 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:19 crc kubenswrapper[4954]: E1206 07:15:19.823807 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert podName:f97f4b54-5ea8-4fe9-b0bc-4937979ad468 nodeName:}" failed. No retries permitted until 2025-12-06 07:15:35.823772292 +0000 UTC m=+1110.637131841 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f56gzs7" (UID: "f97f4b54-5ea8-4fe9-b0bc-4937979ad468") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 07:15:19 crc kubenswrapper[4954]: E1206 07:15:19.915876 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 06 07:15:19 crc kubenswrapper[4954]: E1206 07:15:19.916237 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2mvmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-l8h49_openstack-operators(49a30376-5e1f-4065-a8cd-b728b7413a07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:20 crc kubenswrapper[4954]: I1206 07:15:20.229847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" 
(UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:20 crc kubenswrapper[4954]: I1206 07:15:20.229930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.230110 4954 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.230189 4954 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.230233 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:36.230199324 +0000 UTC m=+1111.043558883 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "metrics-server-cert" not found Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.230274 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs podName:a233e22e-ec80-4af3-b6db-488834c983de nodeName:}" failed. No retries permitted until 2025-12-06 07:15:36.230245255 +0000 UTC m=+1111.043604784 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-xbgpz" (UID: "a233e22e-ec80-4af3-b6db-488834c983de") : secret "webhook-server-cert" not found Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.666383 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 06 07:15:20 crc kubenswrapper[4954]: E1206 07:15:20.667444 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rg2sv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-fpbr4_openstack-operators(d05a311c-788e-41fa-9ddc-52c98fe6dc7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:22 crc kubenswrapper[4954]: E1206 07:15:22.600547 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 06 07:15:22 crc 
kubenswrapper[4954]: E1206 07:15:22.601408 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqqnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-g2rfv_openstack-operators(5dac40b5-0993-4696-995b-2476436126fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:24 crc kubenswrapper[4954]: E1206 07:15:24.009658 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 06 07:15:24 crc kubenswrapper[4954]: E1206 07:15:24.010058 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v9sbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-kkcv7_openstack-operators(0e05eb35-0ec5-4760-8c17-fa88a565b38c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:25 crc kubenswrapper[4954]: E1206 07:15:25.361619 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 06 07:15:25 crc kubenswrapper[4954]: E1206 07:15:25.361898 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwc4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dltbn_openstack-operators(8b7cf0fe-276e-41bf-88a8-76f5f56e884e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:28 crc kubenswrapper[4954]: E1206 07:15:28.195895 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 06 07:15:28 crc kubenswrapper[4954]: E1206 07:15:28.196466 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9zlss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-pqzdl_openstack-operators(a47971dc-993b-47f7-b65b-4348dfd56866): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:28 crc kubenswrapper[4954]: E1206 07:15:28.879723 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 06 07:15:28 crc kubenswrapper[4954]: E1206 07:15:28.880581 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8vj9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-t67pd_openstack-operators(b6dcd22b-76ae-440f-8f04-2d2ea96a07f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:29 crc kubenswrapper[4954]: E1206 07:15:29.438582 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 06 07:15:29 crc kubenswrapper[4954]: E1206 07:15:29.438876 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-8dxm6_openstack-operators(2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:35 crc kubenswrapper[4954]: I1206 07:15:35.885036 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:35 crc kubenswrapper[4954]: I1206 07:15:35.892081 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f97f4b54-5ea8-4fe9-b0bc-4937979ad468-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f56gzs7\" (UID: \"f97f4b54-5ea8-4fe9-b0bc-4937979ad468\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:35 crc kubenswrapper[4954]: I1206 07:15:35.973279 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-flvbp" Dec 06 07:15:35 crc kubenswrapper[4954]: I1206 07:15:35.981852 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.320943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.321014 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.329295 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.334323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a233e22e-ec80-4af3-b6db-488834c983de-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-xbgpz\" (UID: \"a233e22e-ec80-4af3-b6db-488834c983de\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.588816 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6t9px" Dec 06 07:15:36 crc kubenswrapper[4954]: I1206 07:15:36.598026 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:15:37 crc kubenswrapper[4954]: E1206 07:15:37.251706 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 06 07:15:37 crc kubenswrapper[4954]: E1206 07:15:37.252084 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvdsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gt8z5_openstack-operators(f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:15:37 crc kubenswrapper[4954]: E1206 07:15:37.253330 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podUID="f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008" Dec 06 07:15:37 crc kubenswrapper[4954]: I1206 07:15:37.616685 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj"] Dec 06 07:15:37 crc kubenswrapper[4954]: I1206 07:15:37.828972 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7"] Dec 06 07:15:37 crc kubenswrapper[4954]: W1206 07:15:37.865125 4954 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf97f4b54_5ea8_4fe9_b0bc_4937979ad468.slice/crio-1729ee0949d4f53b6932a2812e5de91eab289688c9e06c8f6f6106ed72764836 WatchSource:0}: Error finding container 1729ee0949d4f53b6932a2812e5de91eab289688c9e06c8f6f6106ed72764836: Status 404 returned error can't find the container with id 1729ee0949d4f53b6932a2812e5de91eab289688c9e06c8f6f6106ed72764836 Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.480207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" event={"ID":"f97f4b54-5ea8-4fe9-b0bc-4937979ad468","Type":"ContainerStarted","Data":"1729ee0949d4f53b6932a2812e5de91eab289688c9e06c8f6f6106ed72764836"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.575023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" event={"ID":"41e31a3c-2482-492f-a8d6-220d9aaeefe3","Type":"ContainerStarted","Data":"44b22ccc030da23f4f36034164be26b5458637ba5d93e084166fa10a1b042606"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.582708 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" event={"ID":"5c0b16d3-8d9a-44c8-882b-8a90fd89379d","Type":"ContainerStarted","Data":"49b84df279bbad6f8d9db3182c758cf52374b03eefb982a43a13eb5191d3e72c"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.584851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" event={"ID":"fb5012c2-0242-4c82-bedb-58f94bc2c8d1","Type":"ContainerStarted","Data":"29a8c93ea3b69f1f3e3fa9c7f342e308f3e03575da5154bc975c868c4b779351"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.586289 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" event={"ID":"5f12ac11-7904-4ca1-beff-3bc654b8bf24","Type":"ContainerStarted","Data":"31b0403582bbb78b236cd42c4af2355d063e51e59869b8612fd53626f0d4df41"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.592019 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" event={"ID":"4aa03482-7c89-4997-b9a4-096ba0f2d47a","Type":"ContainerStarted","Data":"e04f5ba301158c7dfb296bca7aee35608aee037fde13ce155d4e5a87f775f425"} Dec 06 07:15:38 crc kubenswrapper[4954]: I1206 07:15:38.861958 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz"] Dec 06 07:15:39 crc kubenswrapper[4954]: I1206 07:15:39.616394 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" event={"ID":"ae04bc7e-d1cf-4063-8236-d3717d1b8f51","Type":"ContainerStarted","Data":"4c1c1491257d83b196f33ac268e00a2f213532d55b7d3c5e9c013bcc2dcbf76f"} Dec 06 07:15:39 crc kubenswrapper[4954]: I1206 07:15:39.618973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" event={"ID":"17e0dd71-107b-454c-97f6-134644f51144","Type":"ContainerStarted","Data":"2674d53b787a929b6d1b52a5bb1fa08710fe263286a7cb91e9fc04eb61781b5a"} Dec 06 07:15:39 crc kubenswrapper[4954]: I1206 07:15:39.634743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" event={"ID":"56b42964-963e-469d-bc21-867a3c560e66","Type":"ContainerStarted","Data":"d3b35203317ec16bc432fcf2239ac1f1e0a224d7ec8f4bb999c7e449b958dca2"} Dec 06 07:15:39 crc kubenswrapper[4954]: I1206 07:15:39.649580 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" event={"ID":"f0e990d9-0ccf-44f0-9811-265e79f933c3","Type":"ContainerStarted","Data":"65c63bd47965ee83e9c5880adb1e36e8dd0fd17fb2f9c5c08b919eaed48508a0"} Dec 06 07:15:39 crc kubenswrapper[4954]: I1206 07:15:39.660457 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" event={"ID":"62e28422-a646-45cc-9a4b-f3de5e2fc463","Type":"ContainerStarted","Data":"06adb18c025af1ea76a9e4da08eebcbe9f80d3f96461c84d170497bd8b8e2eae"} Dec 06 07:15:40 crc kubenswrapper[4954]: I1206 07:15:40.101766 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:15:40 crc kubenswrapper[4954]: I1206 07:15:40.101848 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:15:50 crc kubenswrapper[4954]: E1206 07:15:50.445624 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podUID="f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008" Dec 06 07:15:53 crc kubenswrapper[4954]: I1206 07:15:53.778527 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" event={"ID":"c6aa3674-f8f4-4a0e-bc00-2629d49818e7","Type":"ContainerStarted","Data":"eaae84b7a45afe9e5eea8e3e538081cf77729aa16c83814974a75624579e4e5a"} Dec 06 07:15:53 crc kubenswrapper[4954]: I1206 07:15:53.973251 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:15:54 crc kubenswrapper[4954]: E1206 07:15:54.617149 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 06 07:15:54 crc kubenswrapper[4954]: E1206 07:15:54.617359 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
Dec 06 07:15:54 crc kubenswrapper[4954]: E1206 07:15:54.617359 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-8dxm6_openstack-operators(2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError"
Dec 06 07:15:54 crc kubenswrapper[4954]: E1206 07:15:54.618633 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" podUID="2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b"
Dec 06 07:15:54 crc kubenswrapper[4954]: I1206 07:15:54.793550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" event={"ID":"a233e22e-ec80-4af3-b6db-488834c983de","Type":"ContainerStarted","Data":"4edb929f085c9528bf64a3341cc30505e5dc0191d3a7bb3925433e3bc065e5f3"}
Dec 06 07:15:54 crc kubenswrapper[4954]: I1206 07:15:54.796303 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" event={"ID":"02417e24-828d-4cb4-9a33-3908caad8c9c","Type":"ContainerStarted","Data":"9a92f224e44ce765fca041b4af63b70f458409609b3c85363ba9e5e1334abda4"}
Dec 06 07:15:59 crc kubenswrapper[4954]: E1206 07:15:59.100897 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7"
Dec 06 07:15:59 crc kubenswrapper[4954]: E1206 07:15:59.101866 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5mw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-57548d458d-fn6mj_openstack-operators(41e31a3c-2482-492f-a8d6-220d9aaeefe3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.377858 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.378153 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9zlss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-pqzdl_openstack-operators(a47971dc-993b-47f7-b65b-4348dfd56866): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.379375 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" podUID="a47971dc-993b-47f7-b65b-4348dfd56866"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.414094 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.414332 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8vj9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-t67pd_openstack-operators(b6dcd22b-76ae-440f-8f04-2d2ea96a07f5):
ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.415533 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" podUID="b6dcd22b-76ae-440f-8f04-2d2ea96a07f5"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.726670 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.727242 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvd6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-zh2kh_openstack-operators(fb5012c2-0242-4c82-bedb-58f94bc2c8d1): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError"
Dec 06 07:16:01 crc kubenswrapper[4954]: E1206 07:16:01.728623 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" podUID="fb5012c2-0242-4c82-bedb-58f94bc2c8d1"
Dec 06 07:16:02 crc kubenswrapper[4954]: I1206 07:16:02.110100 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh"
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" podUID="fb5012c2-0242-4c82-bedb-58f94bc2c8d1" Dec 06 07:16:02 crc kubenswrapper[4954]: I1206 07:16:02.112999 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" Dec 06 07:16:02 crc kubenswrapper[4954]: E1206 07:16:02.504979 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 06 07:16:02 crc kubenswrapper[4954]: E1206 07:16:02.505242 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2mvmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-l8h49_openstack-operators(49a30376-5e1f-4065-a8cd-b728b7413a07): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" logger="UnhandledError" Dec 06 07:16:02 crc kubenswrapper[4954]: E1206 07:16:02.506809 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \\\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\\\": context canceled\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" podUID="49a30376-5e1f-4065-a8cd-b728b7413a07" Dec 
Dec 06 07:16:03 crc kubenswrapper[4954]: E1206 07:16:03.130436 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" podUID="fb5012c2-0242-4c82-bedb-58f94bc2c8d1"
Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.141136 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" podUID="fb5012c2-0242-4c82-bedb-58f94bc2c8d1"
Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.625556 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.625795 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v9sbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-kkcv7_openstack-operators(0e05eb35-0ec5-4760-8c17-fa88a565b38c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.627032 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" podUID="0e05eb35-0ec5-4760-8c17-fa88a565b38c"
context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.930284 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqqnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-g2rfv_openstack-operators(5dac40b5-0993-4696-995b-2476436126fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:16:04 crc kubenswrapper[4954]: E1206 07:16:04.932278 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" podUID="5dac40b5-0993-4696-995b-2476436126fc" Dec 06 07:16:05 crc kubenswrapper[4954]: I1206 07:16:05.153575 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" event={"ID":"a233e22e-ec80-4af3-b6db-488834c983de","Type":"ContainerStarted","Data":"b56b9a93e61ef6226431fe66804c92eee847146299ce3eccade427204e6d082c"} Dec 06 07:16:05 crc kubenswrapper[4954]: E1206 07:16:05.952849 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 06 07:16:05 crc kubenswrapper[4954]: E1206 07:16:05.953471 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
Dec 06 07:16:05 crc kubenswrapper[4954]: E1206 07:16:05.953471 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rg2sv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-fpbr4_openstack-operators(d05a311c-788e-41fa-9ddc-52c98fe6dc7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:16:05 crc kubenswrapper[4954]: E1206 07:16:05.954894 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" podUID="d05a311c-788e-41fa-9ddc-52c98fe6dc7c"
Dec 06 07:16:06 crc kubenswrapper[4954]: I1206 07:16:06.179094 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz"
Dec 06 07:16:06 crc kubenswrapper[4954]: I1206 07:16:06.221980 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" podStartSLOduration=63.221955854 podStartE2EDuration="1m3.221955854s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:16:06.220875545 +0000 UTC m=+1141.034234924" watchObservedRunningTime="2025-12-06 07:16:06.221955854 +0000 UTC m=+1141.035315243"
Dec 06 07:16:06 crc kubenswrapper[4954]: E1206 07:16:06.851553 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" podUID="8b7cf0fe-276e-41bf-88a8-76f5f56e884e"
Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.207783 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" event={"ID":"49a30376-5e1f-4065-a8cd-b728b7413a07","Type":"ContainerStarted","Data":"530e05575aea5161395904bd779c46b4929c1642fe71859b84f3dbf0bbb877fd"}
event={"ID":"8b7cf0fe-276e-41bf-88a8-76f5f56e884e","Type":"ContainerStarted","Data":"b31881538d34421e9917f0ec641b291b82d3ffefd59ba6ff200a6c4d300d3f30"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.218747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" event={"ID":"56b42964-963e-469d-bc21-867a3c560e66","Type":"ContainerStarted","Data":"8480b464ebe91cbdfe7a350464c64a56b905cad3fafe8ffaa3f78fe09e6c755a"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.219019 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.224647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" event={"ID":"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b","Type":"ContainerStarted","Data":"4c15084294a2bba78839a9fe1fd01271317e4910f9565b879412352cc8e620d9"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.228232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" event={"ID":"4aa03482-7c89-4997-b9a4-096ba0f2d47a","Type":"ContainerStarted","Data":"79199a0f8f1c9a8c9515e8934e28487ab8975bffe456f808426e459480a1c3b1"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.228542 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.228645 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.233783 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" event={"ID":"f0e990d9-0ccf-44f0-9811-265e79f933c3","Type":"ContainerStarted","Data":"d1acd6f0c0dcad03d7ea31496726ccebf1968522aac62ea8c96f27cdd5333391"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.233989 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.236228 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.236982 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.239453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" event={"ID":"ae04bc7e-d1cf-4063-8236-d3717d1b8f51","Type":"ContainerStarted","Data":"d9fb77f2ef02e1a478e475aa7ade2a696041ea6f4f61463480048c9321e2ef54"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.239680 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.244380 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.244436 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" event={"ID":"17e0dd71-107b-454c-97f6-134644f51144","Type":"ContainerStarted","Data":"006f42f405183220514ca1141008b6839e9b489b1022c0458bb9dc057932dca6"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.244643 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.247601 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.251570 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" event={"ID":"0e05eb35-0ec5-4760-8c17-fa88a565b38c","Type":"ContainerStarted","Data":"d73539a99d4535b500451a90780ac740906124be80e02554889c0ebe6233962f"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.255113 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" event={"ID":"f97f4b54-5ea8-4fe9-b0bc-4937979ad468","Type":"ContainerStarted","Data":"5a45237f272e84eb85a92a4523642aedf8383dfd3d66a63faac76512a7ccc55b"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.263604 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" event={"ID":"62e28422-a646-45cc-9a4b-f3de5e2fc463","Type":"ContainerStarted","Data":"ba00e7b95085f0c97d80241bf03eee55a237b6d3081ee6957c841c4645165a0f"} Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.263672 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.270452 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-frjm7" podStartSLOduration=4.271962031 podStartE2EDuration="1m4.270430952s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.90024661 +0000 UTC m=+1080.713605999" lastFinishedPulling="2025-12-06 07:16:05.898715531 +0000 UTC m=+1140.712074920" observedRunningTime="2025-12-06 07:16:07.268145481 +0000 UTC m=+1142.081504880" watchObservedRunningTime="2025-12-06 07:16:07.270430952 +0000 UTC m=+1142.083790341" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.308385 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.381816 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m9bfd" podStartSLOduration=5.1486948869999996 podStartE2EDuration="1m5.381775586s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.691144296 +0000 UTC m=+1080.504503685" lastFinishedPulling="2025-12-06 07:16:05.924225005 +0000 UTC m=+1140.737584384" observedRunningTime="2025-12-06 07:16:07.31027461 +0000 
UTC m=+1142.123634009" watchObservedRunningTime="2025-12-06 07:16:07.381775586 +0000 UTC m=+1142.195134985" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.381987 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w7lxk" podStartSLOduration=4.312926549 podStartE2EDuration="1m4.381980572s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.891787723 +0000 UTC m=+1080.705147122" lastFinishedPulling="2025-12-06 07:16:05.960841756 +0000 UTC m=+1140.774201145" observedRunningTime="2025-12-06 07:16:07.356702094 +0000 UTC m=+1142.170061483" watchObservedRunningTime="2025-12-06 07:16:07.381980572 +0000 UTC m=+1142.195339971" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.500337 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8x2" podStartSLOduration=4.482874264 podStartE2EDuration="1m4.500313243s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.926627387 +0000 UTC m=+1080.739986776" lastFinishedPulling="2025-12-06 07:16:05.944066366 +0000 UTC m=+1140.757425755" observedRunningTime="2025-12-06 07:16:07.454951907 +0000 UTC m=+1142.268311296" watchObservedRunningTime="2025-12-06 07:16:07.500313243 +0000 UTC m=+1142.313672632" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.533363 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4kdrf" podStartSLOduration=5.258412939 podStartE2EDuration="1m5.533201945s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.623423461 +0000 UTC m=+1080.436782850" lastFinishedPulling="2025-12-06 07:16:05.898212467 +0000 UTC m=+1140.711571856" observedRunningTime="2025-12-06 07:16:07.514713389 +0000 UTC m=+1142.328072798" watchObservedRunningTime="2025-12-06 07:16:07.533201945 +0000 UTC m=+1142.346561334" Dec 06 07:16:07 crc kubenswrapper[4954]: I1206 07:16:07.587266 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" podStartSLOduration=5.301055862 podStartE2EDuration="1m5.587235543s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.612030586 +0000 UTC m=+1080.425389975" lastFinishedPulling="2025-12-06 07:16:05.898210267 +0000 UTC m=+1140.711569656" observedRunningTime="2025-12-06 07:16:07.551764962 +0000 UTC m=+1142.365124351" watchObservedRunningTime="2025-12-06 07:16:07.587235543 +0000 UTC m=+1142.400594932" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.332367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" event={"ID":"f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008","Type":"ContainerStarted","Data":"fd2d3aae94e782e9ee1da6953a2a83824eb81871657b22e0d1b583c6a7d67ef5"} Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.344484 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" event={"ID":"02417e24-828d-4cb4-9a33-3908caad8c9c","Type":"ContainerStarted","Data":"e4a3c0f02e007a33c12bf757c94db4d7a1c0fbadf2f737f65b629ca39df397d7"} Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.345739 4954 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.358964 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.405821 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gt8z5" podStartSLOduration=4.796899848 podStartE2EDuration="1m5.405791558s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:06.087410486 +0000 UTC m=+1080.900769865" lastFinishedPulling="2025-12-06 07:16:06.696302186 +0000 UTC m=+1141.509661575" observedRunningTime="2025-12-06 07:16:08.404217136 +0000 UTC m=+1143.217576525" watchObservedRunningTime="2025-12-06 07:16:08.405791558 +0000 UTC m=+1143.219150937" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.431078 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.434178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" event={"ID":"f97f4b54-5ea8-4fe9-b0bc-4937979ad468","Type":"ContainerStarted","Data":"e739af0c1c61ac35275093ad0bfa540a3f0be0af5771a1b7638552ac81ed08a7"} Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.434225 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.449510 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-vmctk" podStartSLOduration=4.926090111 podStartE2EDuration="1m5.449480699s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.893670534 +0000 UTC m=+1080.707029913" lastFinishedPulling="2025-12-06 07:16:06.417061112 +0000 UTC m=+1141.230420501" observedRunningTime="2025-12-06 07:16:08.442133342 +0000 UTC m=+1143.255492731" watchObservedRunningTime="2025-12-06 07:16:08.449480699 +0000 UTC m=+1143.262840088" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.541732 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" podStartSLOduration=37.969999567 podStartE2EDuration="1m5.541701491s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:37.901064682 +0000 UTC m=+1112.714424071" lastFinishedPulling="2025-12-06 07:16:05.472766586 +0000 UTC m=+1140.286125995" observedRunningTime="2025-12-06 07:16:08.533176462 +0000 UTC m=+1143.346535851" watchObservedRunningTime="2025-12-06 07:16:08.541701491 +0000 UTC m=+1143.355060880" Dec 06 07:16:08 crc kubenswrapper[4954]: I1206 07:16:08.570463 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" podStartSLOduration=6.653775222 podStartE2EDuration="1m5.570437231s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.690795767 +0000 UTC m=+1080.504155156" 
lastFinishedPulling="2025-12-06 07:16:04.607457776 +0000 UTC m=+1139.420817165" observedRunningTime="2025-12-06 07:16:08.562419576 +0000 UTC m=+1143.375778965" watchObservedRunningTime="2025-12-06 07:16:08.570437231 +0000 UTC m=+1143.383796620" Dec 06 07:16:09 crc kubenswrapper[4954]: E1206 07:16:09.050307 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" podUID="41e31a3c-2482-492f-a8d6-220d9aaeefe3" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.456553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" event={"ID":"49a30376-5e1f-4065-a8cd-b728b7413a07","Type":"ContainerStarted","Data":"533b3b97e62638edf4ffd4a42af78c26b0b7efbd17b44aa49411ce524f68ac5c"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.456647 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.456666 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.456679 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" event={"ID":"5f12ac11-7904-4ca1-beff-3bc654b8bf24","Type":"ContainerStarted","Data":"f11ee47259a4c60573119248a08f0a4d9e112aba62f253d94df360eafb9e65d6"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.460991 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" event={"ID":"a47971dc-993b-47f7-b65b-4348dfd56866","Type":"ContainerStarted","Data":"8ce7404baeceda73c9a9c6c3ecb6bce6e1c36d8c60b9be955979c48f25be600f"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.463536 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.470429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" event={"ID":"d05a311c-788e-41fa-9ddc-52c98fe6dc7c","Type":"ContainerStarted","Data":"0bc514d94c685af0f8fbeca516d24d8d8c92f65df9de4a69bdd8d5a9dd25081d"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.477440 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" podStartSLOduration=7.20931508 podStartE2EDuration="1m7.477413257s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.619867046 +0000 UTC m=+1080.433226435" lastFinishedPulling="2025-12-06 07:16:05.887965223 +0000 UTC m=+1140.701324612" observedRunningTime="2025-12-06 07:16:09.472691891 +0000 UTC m=+1144.286051280" watchObservedRunningTime="2025-12-06 07:16:09.477413257 +0000 UTC m=+1144.290772647" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.488321 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" 
event={"ID":"0e05eb35-0ec5-4760-8c17-fa88a565b38c","Type":"ContainerStarted","Data":"c3e8a8542d8f4f0d61211736a7e57433d518140c20c1d1d8588ba6dc08f1bf8a"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.488530 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.499469 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" event={"ID":"41e31a3c-2482-492f-a8d6-220d9aaeefe3","Type":"ContainerStarted","Data":"af465fe6f3c1137359a60e16fa54bd91e94a177e861feeb3fc8e15d74b0e829d"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.511148 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vkd9c" podStartSLOduration=5.271582581 podStartE2EDuration="1m7.511125371s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:04.355943254 +0000 UTC m=+1079.169302653" lastFinishedPulling="2025-12-06 07:16:06.595486054 +0000 UTC m=+1141.408845443" observedRunningTime="2025-12-06 07:16:09.500809845 +0000 UTC m=+1144.314169244" watchObservedRunningTime="2025-12-06 07:16:09.511125371 +0000 UTC m=+1144.324484760" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.537940 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" event={"ID":"5c0b16d3-8d9a-44c8-882b-8a90fd89379d","Type":"ContainerStarted","Data":"e9eefc8329214bf8401fdf1fe11687c79ba43365d7180aebe4b8da71788e40f9"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.539498 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.545351 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.556217 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" event={"ID":"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5","Type":"ContainerStarted","Data":"61b877e9072caa7334df59bb00b73ecbf5f02dce74dd9872f55e7b42c372aae6"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.558310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" event={"ID":"8b7cf0fe-276e-41bf-88a8-76f5f56e884e","Type":"ContainerStarted","Data":"ac63074dba01499a8fc20ce2c47cee80a3307696f22a1b2e3a4294219a945935"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.559546 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.582876 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" event={"ID":"5dac40b5-0993-4696-995b-2476436126fc","Type":"ContainerStarted","Data":"dd2068dd65f79e45208455cb452571e2d200489fa3e3731da6d930fbb7d793cc"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.582955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" event={"ID":"5dac40b5-0993-4696-995b-2476436126fc","Type":"ContainerStarted","Data":"e04e19016ae8327c880844d10f67150939948f5fbf457231836cda810840db20"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.583958 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.616386 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" podStartSLOduration=6.716892034 podStartE2EDuration="1m7.616359281s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.551282298 +0000 UTC m=+1080.364641687" lastFinishedPulling="2025-12-06 07:16:06.450749545 +0000 UTC m=+1141.264108934" observedRunningTime="2025-12-06 07:16:09.610613917 +0000 UTC m=+1144.423973316" watchObservedRunningTime="2025-12-06 07:16:09.616359281 +0000 UTC m=+1144.429718670" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.631160 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" event={"ID":"2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b","Type":"ContainerStarted","Data":"1b97926568fbac3afab0d2810baab2e1921509fed851327ba295380161d0a478"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.659434 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" event={"ID":"c6aa3674-f8f4-4a0e-bc00-2629d49818e7","Type":"ContainerStarted","Data":"cc5c1d38db361507c5d0e566eaf971248dae425fb04c76750b40d8a716135c4f"} Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.659505 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.669849 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" podStartSLOduration=5.93389512 podStartE2EDuration="1m6.669827364s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.724746657 +0000 UTC m=+1080.538106046" lastFinishedPulling="2025-12-06 07:16:06.460678901 +0000 UTC m=+1141.274038290" observedRunningTime="2025-12-06 07:16:09.668586511 +0000 UTC m=+1144.481945900" watchObservedRunningTime="2025-12-06 07:16:09.669827364 +0000 UTC m=+1144.483186753" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.673859 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.713670 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" podStartSLOduration=4.538531335 podStartE2EDuration="1m6.713641098s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.848114653 +0000 UTC m=+1080.661474042" lastFinishedPulling="2025-12-06 07:16:08.023224416 +0000 UTC m=+1142.836583805" observedRunningTime="2025-12-06 07:16:09.712000254 +0000 UTC m=+1144.525359663" watchObservedRunningTime="2025-12-06 07:16:09.713641098 +0000 UTC m=+1144.527000487" Dec 06 07:16:09 crc 
Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.832958 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" podStartSLOduration=4.854995497 podStartE2EDuration="1m7.832923445s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:04.942477382 +0000 UTC m=+1079.755836771" lastFinishedPulling="2025-12-06 07:16:07.92040533 +0000 UTC m=+1142.733764719" observedRunningTime="2025-12-06 07:16:09.759844657 +0000 UTC m=+1144.573204066" watchObservedRunningTime="2025-12-06 07:16:09.832923445 +0000 UTC m=+1144.646282834"
Dec 06 07:16:09 crc kubenswrapper[4954]: I1206 07:16:09.860447 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-khxwr" podStartSLOduration=6.034965599 podStartE2EDuration="1m6.860410492s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:06.056178079 +0000 UTC m=+1080.869537488" lastFinishedPulling="2025-12-06 07:16:06.881622992 +0000 UTC m=+1141.694982381" observedRunningTime="2025-12-06 07:16:09.845222535 +0000 UTC m=+1144.658581924" watchObservedRunningTime="2025-12-06 07:16:09.860410492 +0000 UTC m=+1144.673769881"
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.101547 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.102104 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.102167 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.103063 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.103149 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403" gracePeriod=600
Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.667966 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" event={"ID":"d05a311c-788e-41fa-9ddc-52c98fe6dc7c","Type":"ContainerStarted","Data":"8139ab0a536d95824cd48aaf45c3e6434d218832fc423c194b401c864bb521de"}
status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.672724 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403" exitCode=0 Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.672789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403"} Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.672844 4954 scope.go:117] "RemoveContainer" containerID="16342cc1e9e4e44c49d62d3e217ed95b4b7faa3f28bb16ba730ed0b900485233" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.676350 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" event={"ID":"b6dcd22b-76ae-440f-8f04-2d2ea96a07f5","Type":"ContainerStarted","Data":"45389e5b841dfdf0ae5bbc9906605edfe2986691aee421ecd7db26d34a3c1f18"} Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.676548 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.682433 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" event={"ID":"a47971dc-993b-47f7-b65b-4348dfd56866","Type":"ContainerStarted","Data":"2b750a7fe41258d887519d9388d8c5a9bae300ab0051b917125f4f467f6feed6"} Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.682547 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.685050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" event={"ID":"41e31a3c-2482-492f-a8d6-220d9aaeefe3","Type":"ContainerStarted","Data":"079d945361d213f405203bb4626bdbd4bc9a9e685abe90b0bdee479b498b1fc0"} Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.685994 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.698364 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" podStartSLOduration=5.920798789 podStartE2EDuration="1m7.698337768s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.888267289 +0000 UTC m=+1080.701626678" lastFinishedPulling="2025-12-06 07:16:07.665806268 +0000 UTC m=+1142.479165657" observedRunningTime="2025-12-06 07:16:10.690890438 +0000 UTC m=+1145.504249827" watchObservedRunningTime="2025-12-06 07:16:10.698337768 +0000 UTC m=+1145.511697157" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.737686 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" podStartSLOduration=36.47565275 podStartE2EDuration="1m8.737664572s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 
07:15:37.774845879 +0000 UTC m=+1112.588205268" lastFinishedPulling="2025-12-06 07:16:10.036857701 +0000 UTC m=+1144.850217090" observedRunningTime="2025-12-06 07:16:10.737165358 +0000 UTC m=+1145.550524777" watchObservedRunningTime="2025-12-06 07:16:10.737664572 +0000 UTC m=+1145.551023971" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.765396 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" podStartSLOduration=5.672428594 podStartE2EDuration="1m8.765376705s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:04.930532972 +0000 UTC m=+1079.743892361" lastFinishedPulling="2025-12-06 07:16:08.023481083 +0000 UTC m=+1142.836840472" observedRunningTime="2025-12-06 07:16:10.762644521 +0000 UTC m=+1145.576003920" watchObservedRunningTime="2025-12-06 07:16:10.765376705 +0000 UTC m=+1145.578736094" Dec 06 07:16:10 crc kubenswrapper[4954]: I1206 07:16:10.802172 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" podStartSLOduration=6.724988151 podStartE2EDuration="1m8.80214459s" podCreationTimestamp="2025-12-06 07:15:02 +0000 UTC" firstStartedPulling="2025-12-06 07:15:05.701997697 +0000 UTC m=+1080.515357086" lastFinishedPulling="2025-12-06 07:16:07.779154136 +0000 UTC m=+1142.592513525" observedRunningTime="2025-12-06 07:16:10.800678321 +0000 UTC m=+1145.614037710" watchObservedRunningTime="2025-12-06 07:16:10.80214459 +0000 UTC m=+1145.615503979" Dec 06 07:16:11 crc kubenswrapper[4954]: I1206 07:16:11.696639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679"} Dec 06 07:16:13 crc kubenswrapper[4954]: I1206 07:16:13.088966 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" Dec 06 07:16:13 crc kubenswrapper[4954]: I1206 07:16:13.420464 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" Dec 06 07:16:13 crc kubenswrapper[4954]: I1206 07:16:13.609474 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t67pd" Dec 06 07:16:13 crc kubenswrapper[4954]: I1206 07:16:13.902979 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8dxm6" Dec 06 07:16:14 crc kubenswrapper[4954]: I1206 07:16:14.058883 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-g2rfv" Dec 06 07:16:14 crc kubenswrapper[4954]: I1206 07:16:14.453869 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fpbr4" Dec 06 07:16:15 crc kubenswrapper[4954]: I1206 07:16:15.990536 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f56gzs7" Dec 06 07:16:16 crc kubenswrapper[4954]: I1206 07:16:16.610252 4954 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-xbgpz" Dec 06 07:16:17 crc kubenswrapper[4954]: I1206 07:16:17.762692 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" event={"ID":"fb5012c2-0242-4c82-bedb-58f94bc2c8d1","Type":"ContainerStarted","Data":"84a475aedf66b62129a297ef40378c6d9c397746f5024402f9f6e20741c86ce0"} Dec 06 07:16:17 crc kubenswrapper[4954]: I1206 07:16:17.786080 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-zh2kh" podStartSLOduration=51.447885304 podStartE2EDuration="1m14.786045432s" podCreationTimestamp="2025-12-06 07:15:03 +0000 UTC" firstStartedPulling="2025-12-06 07:15:06.056027935 +0000 UTC m=+1080.869387324" lastFinishedPulling="2025-12-06 07:15:29.394188063 +0000 UTC m=+1104.207547452" observedRunningTime="2025-12-06 07:16:17.782503017 +0000 UTC m=+1152.595862416" watchObservedRunningTime="2025-12-06 07:16:17.786045432 +0000 UTC m=+1152.599404821" Dec 06 07:16:19 crc kubenswrapper[4954]: I1206 07:16:19.265836 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fn6mj" Dec 06 07:16:23 crc kubenswrapper[4954]: I1206 07:16:23.293942 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" Dec 06 07:16:23 crc kubenswrapper[4954]: I1206 07:16:23.997361 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dltbn" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.702981 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"] Dec 06 07:16:38 crc kubenswrapper[4954]: E1206 07:16:38.704263 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b53cf7-013c-4df3-82de-c438b35806ac" containerName="collect-profiles" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.704285 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b53cf7-013c-4df3-82de-c438b35806ac" containerName="collect-profiles" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.704470 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b53cf7-013c-4df3-82de-c438b35806ac" containerName="collect-profiles" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.705509 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.710859 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.710927 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nwhjc" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.711166 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.711312 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.714622 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"] Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.782683 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"] Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.789484 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.793147 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.797208 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"] Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.797357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.797419 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djs88\" (UniqueName: \"kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.898478 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.898904 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.899197 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmw8\" (UniqueName: \"kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " 
pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.899297 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.899354 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djs88\" (UniqueName: \"kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.900517 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:38 crc kubenswrapper[4954]: I1206 07:16:38.946057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djs88\" (UniqueName: \"kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88\") pod \"dnsmasq-dns-5cd484bb89-jbq22\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") " pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.001214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.001342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.001394 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmw8\" (UniqueName: \"kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.003213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.004159 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.020520 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmw8\" (UniqueName: \"kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8\") pod \"dnsmasq-dns-567c455747-5p5w9\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") " pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.027974 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.107380 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-5p5w9" Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.432170 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"] Dec 06 07:16:39 crc kubenswrapper[4954]: I1206 07:16:39.774544 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"] Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.005477 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-5p5w9" event={"ID":"e5872947-55e7-4553-9fdc-0c084c2e0d4a","Type":"ContainerStarted","Data":"cf89f063582658b576bfabcbeb8a044dc29e4b907fd807fa2051cbbf61c94441"} Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.006622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" event={"ID":"b77604dd-a806-43fb-a63d-d6b5c276e1fc","Type":"ContainerStarted","Data":"0284a6d98c3324e69abb7571fc745e8941eccabf18680eb9e9052b730acccb77"} Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.346768 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"] Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.403605 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"] Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.405383 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.439423 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"] Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.525953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.526021 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb7mv\" (UniqueName: \"kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.526122 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.627754 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.627827 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb7mv\" (UniqueName: \"kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.627895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.629491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.629623 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.664338 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb7mv\" (UniqueName: 
\"kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv\") pod \"dnsmasq-dns-bc4b48fc9-n5dql\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:40 crc kubenswrapper[4954]: I1206 07:16:40.745191 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.165848 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.204991 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.206406 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.222183 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.286777 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.357577 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwmv\" (UniqueName: \"kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.357678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.357741 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.459476 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwmv\" (UniqueName: \"kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.459524 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.459601 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " 
pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.460756 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.461885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.485220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwmv\" (UniqueName: \"kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv\") pod \"dnsmasq-dns-cb666b895-xvgwv\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.536092 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.652086 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.655203 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.658453 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.666072 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.666352 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.666659 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qtmg4" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.666898 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.667083 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.667176 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.667462 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.765946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766004 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766070 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766091 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766112 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjvx\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766147 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766625 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.766660 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868159 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868360 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868392 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868437 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjvx\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868533 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868597 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868643 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.868982 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.869378 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.870154 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.870635 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.870651 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.871029 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.880390 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.880718 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" 
Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.895698 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:41 crc kubenswrapper[4954]: I1206 07:16:41.898032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.013704 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjvx\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.017172 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " pod="openstack/rabbitmq-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.046903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" event={"ID":"6c4927a7-3a81-432e-8652-9b16301f905b","Type":"ContainerStarted","Data":"cb92e4771131cbd2c30926bc896e39ca7ac4c3de2d64d06f319860679ed11524"} Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.141711 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:16:42 crc kubenswrapper[4954]: W1206 07:16:42.153519 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d561be2_2452_466d_acd4_230f18bc29f0.slice/crio-f0c95a97c0a011d3208ea6bff90a7dc5870e9dbd9e14e2e43731e739fbceed97 WatchSource:0}: Error finding container f0c95a97c0a011d3208ea6bff90a7dc5870e9dbd9e14e2e43731e739fbceed97: Status 404 returned error can't find the container with id f0c95a97c0a011d3208ea6bff90a7dc5870e9dbd9e14e2e43731e739fbceed97 Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.285514 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.355893 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.357652 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.367278 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.367579 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.367889 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.368037 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.368155 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.368149 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mqh5h" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.368419 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.410203 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.526905 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.526987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527095 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527148 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bq5n\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527206 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527241 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527332 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.527421 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629309 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629410 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629469 4954 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629499 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629530 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629577 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629665 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629724 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629752 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bq5n\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.629782 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.631925 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.632023 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.634757 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.635506 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.635667 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.636423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.654837 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.656776 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.663009 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bq5n\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.683388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.684387 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.729534 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:42 crc kubenswrapper[4954]: I1206 07:16:42.976377 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:16:43 crc kubenswrapper[4954]: W1206 07:16:43.018964 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578bec25_a54c_4f52_95f2_19f20f833437.slice/crio-143d686cdba200b11f7ddb4a4fa6fe1c57963da39e319be167ff52a4eb9b563b WatchSource:0}: Error finding container 143d686cdba200b11f7ddb4a4fa6fe1c57963da39e319be167ff52a4eb9b563b: Status 404 returned error can't find the container with id 143d686cdba200b11f7ddb4a4fa6fe1c57963da39e319be167ff52a4eb9b563b Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.019441 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.064192 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" event={"ID":"8d561be2-2452-466d-acd4-230f18bc29f0","Type":"ContainerStarted","Data":"f0c95a97c0a011d3208ea6bff90a7dc5870e9dbd9e14e2e43731e739fbceed97"} Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.065612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerStarted","Data":"143d686cdba200b11f7ddb4a4fa6fe1c57963da39e319be167ff52a4eb9b563b"} Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.542586 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.858944 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.860972 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.866769 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.867286 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jnx99" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.871724 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.874374 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.877892 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.878987 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.960867 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.960983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961013 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961042 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4v4p\" (UniqueName: \"kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961116 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961142 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:43 crc kubenswrapper[4954]: I1206 07:16:43.961167 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.064395 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065294 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065374 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4v4p\" (UniqueName: \"kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065437 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.065497 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.066132 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.068601 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.071355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.072302 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.081889 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerStarted","Data":"0df6ae2d74b1f5fbb0262097ec95719175941863764004072af6dd9d2a84a4e7"} Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.089243 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4v4p\" (UniqueName: \"kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.103472 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.103836 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.167817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.168527 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.250046 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"openstack-galera-0\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " pod="openstack/openstack-galera-0" Dec 06 07:16:44 crc kubenswrapper[4954]: I1206 07:16:44.490072 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.002024 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.004378 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.008512 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.012030 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.012314 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kbj7s" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.012476 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.012885 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272592 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272748 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272798 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272826 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272856 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272896 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.272919 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzg28\" (UniqueName: \"kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.335705 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.337475 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.340989 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.341226 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vjzbg" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.358668 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.367420 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378723 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378799 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378826 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378843 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzg28\" (UniqueName: \"kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378911 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.378974 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.380555 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.381764 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.382116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.383227 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.390638 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.398233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.404887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.417060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzg28\" (UniqueName: \"kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.470053 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.488546 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.488647 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.488679 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.488753 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gct2\" (UniqueName: \"kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.488851 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.591516 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.591730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.591799 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gct2\" (UniqueName: \"kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.591863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.591909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.593786 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.594080 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.599178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.606267 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.664380 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.668528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gct2\" (UniqueName: \"kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2\") pod \"memcached-0\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " pod="openstack/memcached-0" Dec 06 07:16:45 crc kubenswrapper[4954]: I1206 07:16:45.764828 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.363976 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.366542 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.371241 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-6dh8g" Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.391860 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.587195 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8th6\" (UniqueName: \"kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6\") pod \"kube-state-metrics-0\" (UID: \"3d6a09b0-a359-48db-9682-88114d28e3d9\") " pod="openstack/kube-state-metrics-0" Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.692027 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8th6\" (UniqueName: \"kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6\") pod \"kube-state-metrics-0\" (UID: \"3d6a09b0-a359-48db-9682-88114d28e3d9\") " pod="openstack/kube-state-metrics-0" Dec 06 07:16:47 crc kubenswrapper[4954]: I1206 07:16:47.714339 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8th6\" (UniqueName: \"kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6\") pod \"kube-state-metrics-0\" (UID: \"3d6a09b0-a359-48db-9682-88114d28e3d9\") " pod="openstack/kube-state-metrics-0" Dec 06 07:16:48 crc kubenswrapper[4954]: I1206 07:16:48.001215 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.293007 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lnbn8"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.295423 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.308872 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xskgs"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.310683 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.317294 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350337 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350439 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350461 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350492 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350511 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350528 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rq9\" (UniqueName: \"kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350591 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350620 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxxb\" (UniqueName: \"kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350666 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.350684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.357670 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.358386 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vht27" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.358508 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.438801 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xskgs"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.451834 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.451919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.451962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " 
pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.451999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452033 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452076 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25rq9\" (UniqueName: \"kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452163 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxxb\" (UniqueName: \"kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452244 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452855 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.452920 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.453138 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.453198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.453494 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.453491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.453814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.456819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.458914 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.464915 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.474288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.477438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rq9\" (UniqueName: \"kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9\") pod \"ovn-controller-lnbn8\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.477837 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.479246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxxb\" (UniqueName: \"kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb\") pod \"ovn-controller-ovs-xskgs\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.482024 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.492097 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.492723 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.493014 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.493464 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-c87q6" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.494370 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.506405 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.623052 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.632917 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.656238 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.656987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.657385 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.657576 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.657711 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.657877 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.658787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.658990 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9km\" (UniqueName: \"kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.760918 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: 
I1206 07:16:51.761037 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9km\" (UniqueName: \"kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761125 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761257 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761285 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761321 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.761353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.762728 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.763344 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.767535 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") device mount path \"/mnt/openstack/pv01\"" 
pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.775034 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.781997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:51 crc kubenswrapper[4954]: I1206 07:16:51.802899 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:52 crc kubenswrapper[4954]: I1206 07:16:52.195723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9km\" (UniqueName: \"kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:52 crc kubenswrapper[4954]: I1206 07:16:52.210932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:52 crc kubenswrapper[4954]: I1206 07:16:52.221198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:52 crc kubenswrapper[4954]: I1206 07:16:52.434594 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.919867 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.924843 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.927627 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5kkks" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.927627 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.928847 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.929630 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 06 07:16:54 crc kubenswrapper[4954]: I1206 07:16:54.952998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.057703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.057778 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.057806 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.057825 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.057856 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.059712 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.059860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " 
pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.060149 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9h9b\" (UniqueName: \"kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162606 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9h9b\" (UniqueName: \"kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162717 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162789 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162855 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162877 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.162909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.163593 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.163679 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.164346 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.164695 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.172253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.175882 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.177930 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.184317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9h9b\" (UniqueName: \"kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.191961 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " pod="openstack/ovsdbserver-sb-0" Dec 06 07:16:55 crc kubenswrapper[4954]: I1206 07:16:55.256634 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 07:16:56 crc kubenswrapper[4954]: I1206 07:16:56.693364 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.495034 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.495811 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tzwmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cb666b895-xvgwv_openstack(8d561be2-2452-466d-acd4-230f18bc29f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:17:05 crc kubenswrapper[4954]: I1206 07:17:05.496103 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerStarted","Data":"e5166c69c0f67101f30360e8db04da0fed96c43f3c89d13729a52e456226006f"}
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.497409 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" podUID="8d561be2-2452-466d-acd4-230f18bc29f0"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.530876 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.531142 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djs88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-jbq22_openstack(b77604dd-a806-43fb-a63d-d6b5c276e1fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.532501 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" podUID="b77604dd-a806-43fb-a63d-d6b5c276e1fc"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.541024 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.541320 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-5p5w9_openstack(e5872947-55e7-4553-9fdc-0c084c2e0d4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:17:05 crc kubenswrapper[4954]: E1206 07:17:05.542514 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-5p5w9" podUID="e5872947-55e7-4553-9fdc-0c084c2e0d4a"
Dec 06 07:17:05 crc kubenswrapper[4954]: I1206 07:17:05.939846 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 06 07:17:06 crc kubenswrapper[4954]: E1206 07:17:06.510916 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" podUID="8d561be2-2452-466d-acd4-230f18bc29f0"
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.077962 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-5p5w9"
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.138674 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config\") pod \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") "
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.138783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zmw8\" (UniqueName: \"kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8\") pod \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") "
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.138940 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc\") pod \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\" (UID: \"e5872947-55e7-4553-9fdc-0c084c2e0d4a\") "
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.139811 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config" (OuterVolumeSpecName: "config") pod "e5872947-55e7-4553-9fdc-0c084c2e0d4a" (UID: "e5872947-55e7-4553-9fdc-0c084c2e0d4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.140357 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5872947-55e7-4553-9fdc-0c084c2e0d4a" (UID: "e5872947-55e7-4553-9fdc-0c084c2e0d4a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.150476 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8" (OuterVolumeSpecName: "kube-api-access-2zmw8") pod "e5872947-55e7-4553-9fdc-0c084c2e0d4a" (UID: "e5872947-55e7-4553-9fdc-0c084c2e0d4a"). InnerVolumeSpecName "kube-api-access-2zmw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.241118 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.241184 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5872947-55e7-4553-9fdc-0c084c2e0d4a-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.241196 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zmw8\" (UniqueName: \"kubernetes.io/projected/e5872947-55e7-4553-9fdc-0c084c2e0d4a-kube-api-access-2zmw8\") on node \"crc\" DevicePath \"\""
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.463053 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.463678 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.519600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d6a09b0-a359-48db-9682-88114d28e3d9","Type":"ContainerStarted","Data":"ef343d324936bc75f5611b7e81b73e2669d36961efef4ebb7a3a60185758d0de"}
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.520737 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-5p5w9" event={"ID":"e5872947-55e7-4553-9fdc-0c084c2e0d4a","Type":"ContainerDied","Data":"cf89f063582658b576bfabcbeb8a044dc29e4b907fd807fa2051cbbf61c94441"}
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.520825 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-5p5w9"
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.544474 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.589806 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.596404 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-5p5w9"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.678746 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 06 07:17:07 crc kubenswrapper[4954]: I1206 07:17:07.768335 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xskgs"]
Dec 06 07:17:07 crc kubenswrapper[4954]: W1206 07:17:07.936251 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a82ba0f_bb07_4959_bfdd_8a420c617835.slice/crio-66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871 WatchSource:0}: Error finding container 66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871: Status 404 returned error can't find the container with id 66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871
Dec 06 07:17:07 crc kubenswrapper[4954]: W1206 07:17:07.941267 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b282e6_b847_4393_89f2_7844fce43388.slice/crio-50dd9e0f4748fda949608a54563ba2a818995654b61092a245197da47944b2fe WatchSource:0}: Error finding container 50dd9e0f4748fda949608a54563ba2a818995654b61092a245197da47944b2fe: Status 404 returned error can't find the container with id 50dd9e0f4748fda949608a54563ba2a818995654b61092a245197da47944b2fe
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.045115 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22"
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.158735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config\") pod \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") "
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.158890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djs88\" (UniqueName: \"kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88\") pod \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\" (UID: \"b77604dd-a806-43fb-a63d-d6b5c276e1fc\") "
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.159700 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config" (OuterVolumeSpecName: "config") pod "b77604dd-a806-43fb-a63d-d6b5c276e1fc" (UID: "b77604dd-a806-43fb-a63d-d6b5c276e1fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.173555 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88" (OuterVolumeSpecName: "kube-api-access-djs88") pod "b77604dd-a806-43fb-a63d-d6b5c276e1fc" (UID: "b77604dd-a806-43fb-a63d-d6b5c276e1fc"). InnerVolumeSpecName "kube-api-access-djs88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.262147 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77604dd-a806-43fb-a63d-d6b5c276e1fc-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.262221 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djs88\" (UniqueName: \"kubernetes.io/projected/b77604dd-a806-43fb-a63d-d6b5c276e1fc-kube-api-access-djs88\") on node \"crc\" DevicePath \"\""
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.449742 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.532781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerStarted","Data":"b2797e7aef4be7b6a937371a4afb3ffdfa4ce3a020316108cab88a723f7b13b3"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.534415 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a82ba0f-bb07-4959-bfdd-8a420c617835","Type":"ContainerStarted","Data":"66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.535937 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerStarted","Data":"50dd9e0f4748fda949608a54563ba2a818995654b61092a245197da47944b2fe"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.537534 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerStarted","Data":"8cce58a0ff75d9343d7f22f57084ad11c9d91175d3126e4a853a96e0b5431b67"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.538928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8" event={"ID":"092ab1a2-b565-47cd-9b83-f306883b688e","Type":"ContainerStarted","Data":"efbcb8df143034dd4846aa36db22eee233f1f2282bd1aa5a1d965ff7aba400d2"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.540127 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22" event={"ID":"b77604dd-a806-43fb-a63d-d6b5c276e1fc","Type":"ContainerDied","Data":"0284a6d98c3324e69abb7571fc745e8941eccabf18680eb9e9052b730acccb77"}
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.540241 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-jbq22"
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.609810 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"]
Dec 06 07:17:08 crc kubenswrapper[4954]: I1206 07:17:08.618421 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-jbq22"]
Dec 06 07:17:09 crc kubenswrapper[4954]: I1206 07:17:09.455761 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77604dd-a806-43fb-a63d-d6b5c276e1fc" path="/var/lib/kubelet/pods/b77604dd-a806-43fb-a63d-d6b5c276e1fc/volumes"
Dec 06 07:17:09 crc kubenswrapper[4954]: I1206 07:17:09.456750 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5872947-55e7-4553-9fdc-0c084c2e0d4a" path="/var/lib/kubelet/pods/e5872947-55e7-4553-9fdc-0c084c2e0d4a/volumes"
Dec 06 07:17:10 crc kubenswrapper[4954]: W1206 07:17:10.245226 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a72b28_2cf7_47e0_b7c2_5ff92acedfe7.slice/crio-67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198 WatchSource:0}: Error finding container 67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198: Status 404 returned error can't find the container with id 67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198
Dec 06 07:17:10 crc kubenswrapper[4954]: I1206 07:17:10.559917 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerStarted","Data":"67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198"}
Dec 06 07:17:11 crc kubenswrapper[4954]: I1206 07:17:11.572193 4954 generic.go:334] "Generic (PLEG): container finished" podID="6c4927a7-3a81-432e-8652-9b16301f905b" containerID="a49d63d97b83303d2a217ac2992017385b637df2af116483d603916b4a07f190" exitCode=0
Dec 06 07:17:11 crc kubenswrapper[4954]: I1206 07:17:11.572318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" event={"ID":"6c4927a7-3a81-432e-8652-9b16301f905b","Type":"ContainerDied","Data":"a49d63d97b83303d2a217ac2992017385b637df2af116483d603916b4a07f190"}
Dec 06 07:17:11 crc kubenswrapper[4954]: I1206 07:17:11.578935 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerStarted","Data":"68a8e45e6d42014e5a385ced480ef41ef9ac3c7dbf71d17042ad3054f509fc4c"}
Dec 06 07:17:11 crc kubenswrapper[4954]: I1206 07:17:11.581668 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerStarted","Data":"776ef8a2e162102128c3ff29599625dfa34fd8480af7d7c05f9b0eae321798f5"}
Dec 06 07:17:12 crc kubenswrapper[4954]: I1206 07:17:12.600347 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerStarted","Data":"0a70376c5fc5526ced20f1109a9cab2d6bb28881276a6118e1121122aa40551c"}
Dec 06 07:17:12 crc kubenswrapper[4954]: I1206 07:17:12.606613 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerStarted","Data":"b6518d5655feada488f616ac02858e05fce9b02722643230cbaae6694b695cac"}
Dec 06 07:17:16 crc kubenswrapper[4954]: E1206 07:17:16.476445 4954 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Dec 06 07:17:16 crc kubenswrapper[4954]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/6c4927a7-3a81-432e-8652-9b16301f905b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Dec 06 07:17:16 crc kubenswrapper[4954]: > podSandboxID="cb92e4771131cbd2c30926bc896e39ca7ac4c3de2d64d06f319860679ed11524"
Dec 06 07:17:16 crc kubenswrapper[4954]: E1206 07:17:16.477455 4954 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Dec 06 07:17:16 crc kubenswrapper[4954]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pb7mv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc4b48fc9-n5dql_openstack(6c4927a7-3a81-432e-8652-9b16301f905b): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/6c4927a7-3a81-432e-8652-9b16301f905b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Dec 06 07:17:16 crc kubenswrapper[4954]: > logger="UnhandledError"
Dec 06 07:17:16 crc kubenswrapper[4954]: E1206 07:17:16.478891 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/6c4927a7-3a81-432e-8652-9b16301f905b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" podUID="6c4927a7-3a81-432e-8652-9b16301f905b"
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.666449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerStarted","Data":"f4bf4637ad6f38b3a7a4d4f5071df3a875d12aa499a4cfceb51cfd663705b57f"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.673702 4954 generic.go:334] "Generic (PLEG): container finished" podID="841d2cc4-0265-4e02-af59-f0f322208f02" containerID="776ef8a2e162102128c3ff29599625dfa34fd8480af7d7c05f9b0eae321798f5" exitCode=0
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.673818 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerDied","Data":"776ef8a2e162102128c3ff29599625dfa34fd8480af7d7c05f9b0eae321798f5"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.680625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a82ba0f-bb07-4959-bfdd-8a420c617835","Type":"ContainerStarted","Data":"d27d5d9f536cc22c14abfb14b6f38b3f0d61e96cf9b23ee50d20f02267f9999f"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.680988 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.683449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerStarted","Data":"03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.697173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerStarted","Data":"98c8778c7e8353cc4dc52283f262371684ed883dec692ca28a39a816cbb59cd0"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.701662 4954 generic.go:334] "Generic (PLEG): container finished" podID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerID="68a8e45e6d42014e5a385ced480ef41ef9ac3c7dbf71d17042ad3054f509fc4c" exitCode=0
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.701787 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerDied","Data":"68a8e45e6d42014e5a385ced480ef41ef9ac3c7dbf71d17042ad3054f509fc4c"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.705613 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8" event={"ID":"092ab1a2-b565-47cd-9b83-f306883b688e","Type":"ContainerStarted","Data":"fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db"}
Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.706066 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lnbn8"
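[Editor's note] The CreateContainerError above is a subPath failure: CRI-O could not bind-mount the staged source /var/lib/kubelet/pods/6c4927a7-.../volume-subpaths/dns-svc/dnsmasq-dns/1 to etc/dnsmasq.d/hosts/dns-svc inside the container rootfs. From the observed path, the kubelet appears to stage subPath mounts under volume-subpaths/<volume>/<container>/<mount-index>; dns-svc is the second VolumeMount (index 1) in the container spec dumped above, which matches the trailing "1". A sketch that rebuilds and checks that source path (values from the failing entry; the path layout is an assumption inferred from the log, not an API guarantee):

```go
// Hedged sketch: reconstruct the subPath bind source the runtime tried to mount
// and check whether it exists on the node.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	src := filepath.Join(
		"/var/lib/kubelet/pods",
		"6c4927a7-3a81-432e-8652-9b16301f905b", // pod UID from the failing entry
		"volume-subpaths",
		"dns-svc",     // volume name
		"dnsmasq-dns", // container name
		"1",           // VolumeMount index within the container spec
	)
	if _, err := os.Stat(src); err != nil {
		fmt.Println("bind source missing; container create would fail:", err)
		return
	}
	fmt.Println("bind source present:", src)
}
```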
pod="openstack/ovn-controller-lnbn8" Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.717979 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d6a09b0-a359-48db-9682-88114d28e3d9","Type":"ContainerStarted","Data":"c79369a269d20fe6e8e33c11b23244d21c6b3f4b11f43329f758a9d72ccb557c"} Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.718136 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.748357 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.027366193 podStartE2EDuration="31.748322674s" podCreationTimestamp="2025-12-06 07:16:45 +0000 UTC" firstStartedPulling="2025-12-06 07:17:07.940926041 +0000 UTC m=+1202.754285430" lastFinishedPulling="2025-12-06 07:17:15.661882522 +0000 UTC m=+1210.475241911" observedRunningTime="2025-12-06 07:17:16.74112751 +0000 UTC m=+1211.554486909" watchObservedRunningTime="2025-12-06 07:17:16.748322674 +0000 UTC m=+1211.561682073" Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.797765 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.451380265 podStartE2EDuration="29.7977401s" podCreationTimestamp="2025-12-06 07:16:47 +0000 UTC" firstStartedPulling="2025-12-06 07:17:06.919495293 +0000 UTC m=+1201.732854682" lastFinishedPulling="2025-12-06 07:17:16.265855128 +0000 UTC m=+1211.079214517" observedRunningTime="2025-12-06 07:17:16.791943964 +0000 UTC m=+1211.605303353" watchObservedRunningTime="2025-12-06 07:17:16.7977401 +0000 UTC m=+1211.611099489" Dec 06 07:17:16 crc kubenswrapper[4954]: I1206 07:17:16.838779 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lnbn8" podStartSLOduration=17.588592539 podStartE2EDuration="25.838721709s" podCreationTimestamp="2025-12-06 07:16:51 +0000 UTC" firstStartedPulling="2025-12-06 07:17:07.94016403 +0000 UTC m=+1202.753523439" lastFinishedPulling="2025-12-06 07:17:16.19029322 +0000 UTC m=+1211.003652609" observedRunningTime="2025-12-06 07:17:16.837133766 +0000 UTC m=+1211.650493165" watchObservedRunningTime="2025-12-06 07:17:16.838721709 +0000 UTC m=+1211.652081098" Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.730318 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerID="f4bf4637ad6f38b3a7a4d4f5071df3a875d12aa499a4cfceb51cfd663705b57f" exitCode=0 Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.730435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerDied","Data":"f4bf4637ad6f38b3a7a4d4f5071df3a875d12aa499a4cfceb51cfd663705b57f"} Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.735108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerStarted","Data":"a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a"} Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.740407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" event={"ID":"6c4927a7-3a81-432e-8652-9b16301f905b","Type":"ContainerStarted","Data":"5ca913012735c5d1657d0e86011c8e22f7ecb63f9b2ac5d0c79b98ceb688224c"} Dec 06 
07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.740723 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.744095 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerStarted","Data":"eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae"} Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.778884 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.366848196 podStartE2EDuration="34.778860315s" podCreationTimestamp="2025-12-06 07:16:43 +0000 UTC" firstStartedPulling="2025-12-06 07:17:07.931305153 +0000 UTC m=+1202.744664542" lastFinishedPulling="2025-12-06 07:17:10.343317262 +0000 UTC m=+1205.156676661" observedRunningTime="2025-12-06 07:17:17.776008959 +0000 UTC m=+1212.589368358" watchObservedRunningTime="2025-12-06 07:17:17.778860315 +0000 UTC m=+1212.592219704" Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.803605 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.027189595 podStartE2EDuration="35.803576879s" podCreationTimestamp="2025-12-06 07:16:42 +0000 UTC" firstStartedPulling="2025-12-06 07:17:04.587624954 +0000 UTC m=+1199.400984343" lastFinishedPulling="2025-12-06 07:17:10.364012238 +0000 UTC m=+1205.177371627" observedRunningTime="2025-12-06 07:17:17.798657407 +0000 UTC m=+1212.612016816" watchObservedRunningTime="2025-12-06 07:17:17.803576879 +0000 UTC m=+1212.616936268" Dec 06 07:17:17 crc kubenswrapper[4954]: I1206 07:17:17.825994 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" podStartSLOduration=11.191329259 podStartE2EDuration="37.8259701s" podCreationTimestamp="2025-12-06 07:16:40 +0000 UTC" firstStartedPulling="2025-12-06 07:16:41.31189562 +0000 UTC m=+1176.125255009" lastFinishedPulling="2025-12-06 07:17:07.946536461 +0000 UTC m=+1202.759895850" observedRunningTime="2025-12-06 07:17:17.821656784 +0000 UTC m=+1212.635016193" watchObservedRunningTime="2025-12-06 07:17:17.8259701 +0000 UTC m=+1212.639329489" Dec 06 07:17:19 crc kubenswrapper[4954]: I1206 07:17:19.765756 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerStarted","Data":"f5f430ad3df9e0cf4cc84ee32b89af25b995b3297d702d7f97feec4a11fa4787"} Dec 06 07:17:19 crc kubenswrapper[4954]: I1206 07:17:19.767528 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerStarted","Data":"2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc"} Dec 06 07:17:19 crc kubenswrapper[4954]: I1206 07:17:19.769780 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerStarted","Data":"e5d78eb2813e941bbf1327182aea6826153731bf9030370520ea6cf536e2c69c"} Dec 06 07:17:19 crc kubenswrapper[4954]: I1206 07:17:19.795986 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.769063316 podStartE2EDuration="29.795960339s" podCreationTimestamp="2025-12-06 07:16:50 +0000 UTC" 
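[Editor's note] The pod_startup_latency_tracker entries encode two durations that can be re-derived from their own fields: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The memcached-0 numbers above check out exactly: 31.748322674s - 7.720956481s = 24.027366193s. A small verification sketch using those logged values:

```go
// Hedged sketch: re-derive memcached-0's startup latencies from the tracker entry.
// The subtraction rule is inferred from the logged fields themselves, not from
// reading the kubelet source.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-06 07:16:45 +0000 UTC")
	firstPull := mustParse("2025-12-06 07:17:07.940926041 +0000 UTC")
	lastPull := mustParse("2025-12-06 07:17:15.661882522 +0000 UTC")
	observed := mustParse("2025-12-06 07:17:16.748322674 +0000 UTC")

	e2e := observed.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull)  // E2E minus image-pull window
	fmt.Println("podStartE2EDuration:", e2e) // 31.748322674s, as logged
	fmt.Println("podStartSLOduration:", slo) // 24.027366193s, as logged
}
```

The kube-state-metrics-0 entry obeys the same rule: 29.7977401s - 9.346359835s = 20.451380265s.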
firstStartedPulling="2025-12-06 07:17:10.25565276 +0000 UTC m=+1205.069012149" lastFinishedPulling="2025-12-06 07:17:19.282549783 +0000 UTC m=+1214.095909172" observedRunningTime="2025-12-06 07:17:19.792922997 +0000 UTC m=+1214.606282386" watchObservedRunningTime="2025-12-06 07:17:19.795960339 +0000 UTC m=+1214.609319728" Dec 06 07:17:20 crc kubenswrapper[4954]: I1206 07:17:20.257133 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 07:17:20 crc kubenswrapper[4954]: I1206 07:17:20.866789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerStarted","Data":"f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d"} Dec 06 07:17:20 crc kubenswrapper[4954]: I1206 07:17:20.953172 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.647312237 podStartE2EDuration="27.953151069s" podCreationTimestamp="2025-12-06 07:16:53 +0000 UTC" firstStartedPulling="2025-12-06 07:17:07.967837503 +0000 UTC m=+1202.781196892" lastFinishedPulling="2025-12-06 07:17:19.273676335 +0000 UTC m=+1214.087035724" observedRunningTime="2025-12-06 07:17:19.81799483 +0000 UTC m=+1214.631354239" watchObservedRunningTime="2025-12-06 07:17:20.953151069 +0000 UTC m=+1215.766510458" Dec 06 07:17:20 crc kubenswrapper[4954]: I1206 07:17:20.956454 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xskgs" podStartSLOduration=22.103475873 podStartE2EDuration="29.956441637s" podCreationTimestamp="2025-12-06 07:16:51 +0000 UTC" firstStartedPulling="2025-12-06 07:17:07.99679551 +0000 UTC m=+1202.810154899" lastFinishedPulling="2025-12-06 07:17:15.849761274 +0000 UTC m=+1210.663120663" observedRunningTime="2025-12-06 07:17:20.951253208 +0000 UTC m=+1215.764612617" watchObservedRunningTime="2025-12-06 07:17:20.956441637 +0000 UTC m=+1215.769801026" Dec 06 07:17:21 crc kubenswrapper[4954]: I1206 07:17:21.633881 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:17:21 crc kubenswrapper[4954]: I1206 07:17:21.634380 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.257215 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.300409 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.434791 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.434888 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.483203 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.887369 4954 generic.go:334] "Generic (PLEG): container finished" podID="8d561be2-2452-466d-acd4-230f18bc29f0" containerID="9835f1664512d3768f5ae0f88e5273fdf7a5237361968547c30e9fa79257966d" exitCode=0 Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.887459 
Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.938773 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 06 07:17:22 crc kubenswrapper[4954]: I1206 07:17:22.960348 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.179688 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"]
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.180021 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="dnsmasq-dns" containerID="cri-o://5ca913012735c5d1657d0e86011c8e22f7ecb63f9b2ac5d0c79b98ceb688224c" gracePeriod=10
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.181299 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.245789 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"]
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.247985 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.252741 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.255034 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"]
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.340474 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"]
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.342790 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.350524 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.369386 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"]
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431439 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431532 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6qj9\" (UniqueName: \"kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431708 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431747 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s"
Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431772 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s"
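[Editor's note] "Killing container with a grace period ... gracePeriod=10" above is the standard two-phase stop: the runtime delivers SIGTERM, waits up to the grace period, then escalates to SIGKILL. A Unix-only sketch of those semantics with an ordinary process standing in for the container; this illustrates the contract, not the actual CRI StopContainer call:

```go
// Hedged sketch: SIGTERM, wait up to the grace period, then SIGKILL.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for the container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	cmd.Process.Signal(syscall.SIGTERM) // polite stop request
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(10 * time.Second): // gracePeriod=10, as in the log
		cmd.Process.Kill() // escalate to SIGKILL
		<-done
		fmt.Println("killed after grace period expired")
	}
}
```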
\"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431796 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.431830 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vhn9\" (UniqueName: \"kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533597 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533896 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6qj9\" (UniqueName: \"kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533929 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533957 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.533992 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.534011 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: 
\"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.534060 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.534084 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.534109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhn9\" (UniqueName: \"kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.534912 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.536616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.536936 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.537730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.539448 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.540840 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 
07:17:23.555935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.572533 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.582930 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.590588 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vhn9\" (UniqueName: \"kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9\") pod \"dnsmasq-dns-6c67bcdbf5-92x9s\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.597720 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.598684 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6qj9\" (UniqueName: \"kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9\") pod \"ovn-controller-metrics-hcgg8\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.610649 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.612311 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.615830 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5zxpw" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.616196 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.616369 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.621165 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.649379 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.693921 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.701096 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.728000 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.734821 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.738426 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740406 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdlk\" (UniqueName: \"kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740524 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740622 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740695 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.740723 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.843837 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.845831 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.845910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.845957 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.846046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.846095 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.847730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.848518 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.854990 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.856913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.856955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.857002 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkhw2\" (UniqueName: \"kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.857050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.857089 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.857252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdlk\" (UniqueName: \"kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.857287 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.878944 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.902750 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" event={"ID":"8d561be2-2452-466d-acd4-230f18bc29f0","Type":"ContainerStarted","Data":"0b3202f3b1f1e9fe1dceb1123f94d13cc18c38a97e47b551dfb1e8359e3e1784"} Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.902827 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="dnsmasq-dns" containerID="cri-o://0b3202f3b1f1e9fe1dceb1123f94d13cc18c38a97e47b551dfb1e8359e3e1784" gracePeriod=10 Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.903541 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.906864 4954 generic.go:334] "Generic (PLEG): container finished" podID="6c4927a7-3a81-432e-8652-9b16301f905b" containerID="5ca913012735c5d1657d0e86011c8e22f7ecb63f9b2ac5d0c79b98ceb688224c" exitCode=0 Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.907337 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" 
event={"ID":"6c4927a7-3a81-432e-8652-9b16301f905b","Type":"ContainerDied","Data":"5ca913012735c5d1657d0e86011c8e22f7ecb63f9b2ac5d0c79b98ceb688224c"} Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.909862 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdlk\" (UniqueName: \"kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.920762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.960106 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.960174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkhw2\" (UniqueName: \"kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.960202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.960264 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.960483 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.961539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.962315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.962761 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.981361 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" podStartSLOduration=-9223371993.873453 podStartE2EDuration="42.981322002s" podCreationTimestamp="2025-12-06 07:16:41 +0000 UTC" firstStartedPulling="2025-12-06 07:16:42.156634616 +0000 UTC m=+1176.969994005" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:23.953244258 +0000 UTC m=+1218.766603657" watchObservedRunningTime="2025-12-06 07:17:23.981322002 +0000 UTC m=+1218.794681391" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.981511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:23 crc kubenswrapper[4954]: I1206 07:17:23.982633 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.008904 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkhw2\" (UniqueName: \"kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2\") pod \"dnsmasq-dns-984c76dd7-vkbsv\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.118347 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.491336 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.492783 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.642430 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"] Dec 06 07:17:24 crc kubenswrapper[4954]: W1206 07:17:24.754621 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46bc8cab_9415_4928_a4cf_715976f4119c.slice/crio-8b0b07b2880457c8afaec6b0a363c1437bf7ef331100c7c1562c00f373dd3770 WatchSource:0}: Error finding container 8b0b07b2880457c8afaec6b0a363c1437bf7ef331100c7c1562c00f373dd3770: Status 404 returned error can't find the container with id 8b0b07b2880457c8afaec6b0a363c1437bf7ef331100c7c1562c00f373dd3770 Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.868634 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.921333 4954 generic.go:334] "Generic (PLEG): container finished" podID="8d561be2-2452-466d-acd4-230f18bc29f0" containerID="0b3202f3b1f1e9fe1dceb1123f94d13cc18c38a97e47b551dfb1e8359e3e1784" exitCode=0 Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.921423 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" event={"ID":"8d561be2-2452-466d-acd4-230f18bc29f0","Type":"ContainerDied","Data":"0b3202f3b1f1e9fe1dceb1123f94d13cc18c38a97e47b551dfb1e8359e3e1784"} Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.922542 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" event={"ID":"46bc8cab-9415-4928-a4cf-715976f4119c","Type":"ContainerStarted","Data":"8b0b07b2880457c8afaec6b0a363c1437bf7ef331100c7c1562c00f373dd3770"} Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.922982 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.932824 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" event={"ID":"6c4927a7-3a81-432e-8652-9b16301f905b","Type":"ContainerDied","Data":"cb92e4771131cbd2c30926bc896e39ca7ac4c3de2d64d06f319860679ed11524"} Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.932884 4954 scope.go:117] "RemoveContainer" containerID="5ca913012735c5d1657d0e86011c8e22f7ecb63f9b2ac5d0c79b98ceb688224c" Dec 06 07:17:24 crc kubenswrapper[4954]: I1206 07:17:24.988522 4954 scope.go:117] "RemoveContainer" containerID="a49d63d97b83303d2a217ac2992017385b637df2af116483d603916b4a07f190" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.009086 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.077588 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090283 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzwmv\" (UniqueName: \"kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv\") pod \"8d561be2-2452-466d-acd4-230f18bc29f0\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090408 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc\") pod \"6c4927a7-3a81-432e-8652-9b16301f905b\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090482 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb7mv\" (UniqueName: \"kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv\") pod \"6c4927a7-3a81-432e-8652-9b16301f905b\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090595 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config\") pod \"8d561be2-2452-466d-acd4-230f18bc29f0\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config\") pod \"6c4927a7-3a81-432e-8652-9b16301f905b\" (UID: \"6c4927a7-3a81-432e-8652-9b16301f905b\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.090663 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc\") pod \"8d561be2-2452-466d-acd4-230f18bc29f0\" (UID: \"8d561be2-2452-466d-acd4-230f18bc29f0\") " Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.098329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv" (OuterVolumeSpecName: "kube-api-access-pb7mv") pod "6c4927a7-3a81-432e-8652-9b16301f905b" (UID: "6c4927a7-3a81-432e-8652-9b16301f905b"). InnerVolumeSpecName "kube-api-access-pb7mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.099592 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv" (OuterVolumeSpecName: "kube-api-access-tzwmv") pod "8d561be2-2452-466d-acd4-230f18bc29f0" (UID: "8d561be2-2452-466d-acd4-230f18bc29f0"). InnerVolumeSpecName "kube-api-access-tzwmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.168367 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config" (OuterVolumeSpecName: "config") pod "8d561be2-2452-466d-acd4-230f18bc29f0" (UID: "8d561be2-2452-466d-acd4-230f18bc29f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.172004 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c4927a7-3a81-432e-8652-9b16301f905b" (UID: "6c4927a7-3a81-432e-8652-9b16301f905b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.177903 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.181082 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d561be2-2452-466d-acd4-230f18bc29f0" (UID: "8d561be2-2452-466d-acd4-230f18bc29f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.186928 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.190232 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config" (OuterVolumeSpecName: "config") pod "6c4927a7-3a81-432e-8652-9b16301f905b" (UID: "6c4927a7-3a81-432e-8652-9b16301f905b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192363 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzwmv\" (UniqueName: \"kubernetes.io/projected/8d561be2-2452-466d-acd4-230f18bc29f0-kube-api-access-tzwmv\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192391 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192401 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb7mv\" (UniqueName: \"kubernetes.io/projected/6c4927a7-3a81-432e-8652-9b16301f905b-kube-api-access-pb7mv\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192411 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192419 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4927a7-3a81-432e-8652-9b16301f905b-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.192427 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d561be2-2452-466d-acd4-230f18bc29f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:25 crc kubenswrapper[4954]: W1206 07:17:25.194294 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75a7f9b6_6582_4498_bccd_954270bc3f8e.slice/crio-671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3 WatchSource:0}: Error finding container 671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3: Status 404 returned error can't find the container with id 671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3 Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.206669 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.417933 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-w9lhq"] Dec 06 07:17:25 crc kubenswrapper[4954]: E1206 07:17:25.418358 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418374 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: E1206 07:17:25.418391 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="init" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418399 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="init" Dec 06 07:17:25 crc kubenswrapper[4954]: E1206 07:17:25.418410 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418417 4954 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: E1206 07:17:25.418431 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="init" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418440 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="init" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418642 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.418658 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" containerName="dnsmasq-dns" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.419317 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.434084 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b67a-account-create-update-pbmk9"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.435930 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.443825 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.461096 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w9lhq"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.466813 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b67a-account-create-update-pbmk9"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.496832 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmsq\" (UniqueName: \"kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.497370 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.598775 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts\") pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.598908 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7k6\" (UniqueName: \"kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6\") pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") 
" pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.599052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmsq\" (UniqueName: \"kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.599097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.600144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.615911 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m72rg"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.617253 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.624854 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmsq\" (UniqueName: \"kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq\") pod \"keystone-db-create-w9lhq\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.628860 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m72rg"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.665608 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.666931 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.702188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7k6\" (UniqueName: \"kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6\") pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.713674 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.714192 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts\") 
pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.714323 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llvwj\" (UniqueName: \"kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.716058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts\") pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.733784 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7k6\" (UniqueName: \"kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6\") pod \"keystone-b67a-account-create-update-pbmk9\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.745920 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-88ab-account-create-update-8grdl"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.747419 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.751255 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.754046 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.757845 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-88ab-account-create-update-8grdl"] Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.769057 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.770260 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.790149 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.817684 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llvwj\" (UniqueName: \"kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.817826 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.820033 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.844617 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llvwj\" (UniqueName: \"kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj\") pod \"placement-db-create-m72rg\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.920372 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.920491 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tsvx\" (UniqueName: \"kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.961278 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgg8" event={"ID":"75a7f9b6-6582-4498-bccd-954270bc3f8e","Type":"ContainerStarted","Data":"051545d1d756d25636df272c7f761b7921bf9ef327f55451e1d78ab235564be7"} Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.961355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgg8" event={"ID":"75a7f9b6-6582-4498-bccd-954270bc3f8e","Type":"ContainerStarted","Data":"671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3"} Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 
07:17:25.968553 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-n5dql" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.972847 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" event={"ID":"8d561be2-2452-466d-acd4-230f18bc29f0","Type":"ContainerDied","Data":"f0c95a97c0a011d3208ea6bff90a7dc5870e9dbd9e14e2e43731e739fbceed97"} Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.972922 4954 scope.go:117] "RemoveContainer" containerID="0b3202f3b1f1e9fe1dceb1123f94d13cc18c38a97e47b551dfb1e8359e3e1784" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.973258 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-xvgwv" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.975081 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerStarted","Data":"5f9b60d52550887376cc37531f59cf1c2cf9cd5805e21382219d8a746606cefb"} Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.989202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m72rg" Dec 06 07:17:25 crc kubenswrapper[4954]: I1206 07:17:25.998262 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hcgg8" podStartSLOduration=2.99822837 podStartE2EDuration="2.99822837s" podCreationTimestamp="2025-12-06 07:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:25.982536699 +0000 UTC m=+1220.795896098" watchObservedRunningTime="2025-12-06 07:17:25.99822837 +0000 UTC m=+1220.811587759" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.010245 4954 generic.go:334] "Generic (PLEG): container finished" podID="46bc8cab-9415-4928-a4cf-715976f4119c" containerID="d290b1fac613ad4d1e95398cdc516a96f9966a8a892928d7e64a4ad000e655d3" exitCode=0 Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.010369 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" event={"ID":"46bc8cab-9415-4928-a4cf-715976f4119c","Type":"ContainerDied","Data":"d290b1fac613ad4d1e95398cdc516a96f9966a8a892928d7e64a4ad000e655d3"} Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.022006 4954 generic.go:334] "Generic (PLEG): container finished" podID="d67c94fc-f109-4536-b490-59598ee00232" containerID="7f199df227d670a85f4e663f49a89301006012cae58fd5b6e637804cb99896df" exitCode=0 Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.025733 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" event={"ID":"d67c94fc-f109-4536-b490-59598ee00232","Type":"ContainerDied","Data":"7f199df227d670a85f4e663f49a89301006012cae58fd5b6e637804cb99896df"} Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.025776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" event={"ID":"d67c94fc-f109-4536-b490-59598ee00232","Type":"ContainerStarted","Data":"aa27b09f792da0fa724357a00850447a1cf434bec94238fdc218ed1290a47f4a"} Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.027497 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.027713 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tsvx\" (UniqueName: \"kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.027927 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.027990 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.037705 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-829jd"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.039175 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.077295 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-n5dql"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.077808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tsvx\" (UniqueName: \"kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx\") pod \"placement-88ab-account-create-update-8grdl\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.089640 4954 scope.go:117] "RemoveContainer" containerID="9835f1664512d3768f5ae0f88e5273fdf7a5237361968547c30e9fa79257966d" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.129572 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-829jd"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.133849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dfh\" (UniqueName: \"kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.133948 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.178889 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.183062 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.207773 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.227770 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-xvgwv"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.235492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.235695 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28dfh\" (UniqueName: \"kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.240885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.264481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dfh\" (UniqueName: \"kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh\") pod \"glance-db-create-829jd\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.275193 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d905-account-create-update-tljn7"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.279375 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.284764 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.289757 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d905-account-create-update-tljn7"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.338473 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.338659 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmdsx\" (UniqueName: \"kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.377323 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-829jd" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.442241 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.442353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmdsx\" (UniqueName: \"kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.444088 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.465352 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmdsx\" (UniqueName: \"kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx\") pod \"glance-d905-account-create-update-tljn7\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.613166 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.719881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w9lhq"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.728104 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b67a-account-create-update-pbmk9"] Dec 06 07:17:26 crc kubenswrapper[4954]: I1206 07:17:26.772433 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m72rg"] Dec 06 07:17:26 crc kubenswrapper[4954]: W1206 07:17:26.928292 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054f854f_83bd_4174_bf86_da2b1a6cfedb.slice/crio-14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815 WatchSource:0}: Error finding container 14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815: Status 404 returned error can't find the container with id 14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815 Dec 06 07:17:26 crc kubenswrapper[4954]: W1206 07:17:26.932812 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3656e91b_4a6f_4ae8_a097_0f5961df5f25.slice/crio-59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0 WatchSource:0}: Error finding container 59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0: Status 404 returned error can't find the container with id 59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0 Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.040678 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m72rg" event={"ID":"3656e91b-4a6f-4ae8-a097-0f5961df5f25","Type":"ContainerStarted","Data":"59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0"} Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.049178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b67a-account-create-update-pbmk9" event={"ID":"054f854f-83bd-4174-bf86-da2b1a6cfedb","Type":"ContainerStarted","Data":"14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815"} Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.049608 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-88ab-account-create-update-8grdl"] Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.055765 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" event={"ID":"46bc8cab-9415-4928-a4cf-715976f4119c","Type":"ContainerStarted","Data":"22dc101badfdcc2e92af5b343a2700b78142682810bf26c49fdfc617c65a1d33"} Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.055867 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.060291 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w9lhq" event={"ID":"0cfba967-3154-41c7-b541-45d9a8a2a122","Type":"ContainerStarted","Data":"3ef35d22d752a5131fcbd95938b2263d4dcd17010bd8bcac20a17d3207e438ff"} Dec 06 07:17:27 crc kubenswrapper[4954]: W1206 07:17:27.094435 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372dbd7a_e026_4794_9f89_071af1d6e3a8.slice/crio-893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1 WatchSource:0}: Error finding container 893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1: Status 404 returned error can't find the container with id 893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1 Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.525221 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4927a7-3a81-432e-8652-9b16301f905b" path="/var/lib/kubelet/pods/6c4927a7-3a81-432e-8652-9b16301f905b/volumes" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.526363 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d561be2-2452-466d-acd4-230f18bc29f0" path="/var/lib/kubelet/pods/8d561be2-2452-466d-acd4-230f18bc29f0/volumes" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.598998 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" podStartSLOduration=4.598964831 podStartE2EDuration="4.598964831s" podCreationTimestamp="2025-12-06 07:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:27.089749308 +0000 UTC m=+1221.903108707" watchObservedRunningTime="2025-12-06 07:17:27.598964831 +0000 UTC m=+1222.412324220" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.610660 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-829jd"] Dec 06 07:17:27 crc kubenswrapper[4954]: W1206 07:17:27.617466 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca78e5f_9977_49fa_b175_504c48bf1861.slice/crio-e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74 WatchSource:0}: Error finding container e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74: Status 404 returned error can't find the container with id e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74 Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.662784 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d905-account-create-update-tljn7"] Dec 06 07:17:27 crc kubenswrapper[4954]: W1206 07:17:27.693982 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b5d766_a567_4328_9942_c0dc2eede97e.slice/crio-86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865 WatchSource:0}: Error finding container 86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865: Status 404 returned error can't find the container with id 86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865 Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.858274 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"] Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.909232 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"] Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.918767 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.929465 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"] Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.998853 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.998935 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.999022 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.999040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:27 crc kubenswrapper[4954]: I1206 07:17:27.999103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7zg\" (UniqueName: \"kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.012481 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.081904 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d905-account-create-update-tljn7" event={"ID":"94b5d766-a567-4328-9942-c0dc2eede97e","Type":"ContainerStarted","Data":"86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.084686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-829jd" event={"ID":"9ca78e5f-9977-49fa-b175-504c48bf1861","Type":"ContainerStarted","Data":"fa14aef8f5202091e49f53bf62c1c85cbe928b299e3ba4aafb049519f6b59f0a"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.084741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-829jd" event={"ID":"9ca78e5f-9977-49fa-b175-504c48bf1861","Type":"ContainerStarted","Data":"e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.091620 4954 generic.go:334] "Generic (PLEG): container finished" podID="3656e91b-4a6f-4ae8-a097-0f5961df5f25" 
containerID="6bab50824cc7d0430e80d33b5c7beda9916d889f5e5c1a012ab2aebf51826a03" exitCode=0 Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.091743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m72rg" event={"ID":"3656e91b-4a6f-4ae8-a097-0f5961df5f25","Type":"ContainerDied","Data":"6bab50824cc7d0430e80d33b5c7beda9916d889f5e5c1a012ab2aebf51826a03"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.098085 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b67a-account-create-update-pbmk9" event={"ID":"054f854f-83bd-4174-bf86-da2b1a6cfedb","Type":"ContainerStarted","Data":"928615f26172b5cb755987526ac5bb1eb4f54e75a55d4d52248a3bd9c543805c"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.101405 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.102839 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.102970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.102992 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.103053 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7zg\" (UniqueName: \"kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.105604 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.106083 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.106597 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.113156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.116928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerStarted","Data":"b185262581277ddddfd5f329caf8d803c0479c8dea4732929b0bc2936e8a4cb6"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.116995 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerStarted","Data":"9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.117366 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.123881 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-829jd" podStartSLOduration=3.123827605 podStartE2EDuration="3.123827605s" podCreationTimestamp="2025-12-06 07:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:28.114942026 +0000 UTC m=+1222.928301415" watchObservedRunningTime="2025-12-06 07:17:28.123827605 +0000 UTC m=+1222.937187004" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.134005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" event={"ID":"d67c94fc-f109-4536-b490-59598ee00232","Type":"ContainerStarted","Data":"56439ebbe0c086c44d87c285cf1e23c4b5e7abdacd33aacd55a69face6cdcb1c"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.134237 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.140602 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7zg\" (UniqueName: \"kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg\") pod \"dnsmasq-dns-784d65c867-2mx6q\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.148125 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w9lhq" event={"ID":"0cfba967-3154-41c7-b541-45d9a8a2a122","Type":"ContainerStarted","Data":"5da64da480bd207602ab54760066366c1392ecfb9339d3d709fba33f063c4aac"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.149614 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b67a-account-create-update-pbmk9" podStartSLOduration=3.149587866 podStartE2EDuration="3.149587866s" podCreationTimestamp="2025-12-06 07:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-06 07:17:28.144807898 +0000 UTC m=+1222.958167307" watchObservedRunningTime="2025-12-06 07:17:28.149587866 +0000 UTC m=+1222.962947275" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.155073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88ab-account-create-update-8grdl" event={"ID":"372dbd7a-e026-4794-9f89-071af1d6e3a8","Type":"ContainerStarted","Data":"f9ce5a5580680603b5f5978b8290b06d0f73e55dfade2adf905db6a7508d152b"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.155141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88ab-account-create-update-8grdl" event={"ID":"372dbd7a-e026-4794-9f89-071af1d6e3a8","Type":"ContainerStarted","Data":"893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1"} Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.204340 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" podStartSLOduration=5.204315034 podStartE2EDuration="5.204315034s" podCreationTimestamp="2025-12-06 07:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:28.192082516 +0000 UTC m=+1223.005441895" watchObservedRunningTime="2025-12-06 07:17:28.204315034 +0000 UTC m=+1223.017674423" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.264279 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.447856796 podStartE2EDuration="5.264242753s" podCreationTimestamp="2025-12-06 07:17:23 +0000 UTC" firstStartedPulling="2025-12-06 07:17:25.195644275 +0000 UTC m=+1220.009003664" lastFinishedPulling="2025-12-06 07:17:27.012030222 +0000 UTC m=+1221.825389621" observedRunningTime="2025-12-06 07:17:28.249880717 +0000 UTC m=+1223.063240126" watchObservedRunningTime="2025-12-06 07:17:28.264242753 +0000 UTC m=+1223.077602142" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.285781 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-88ab-account-create-update-8grdl" podStartSLOduration=3.28574975 podStartE2EDuration="3.28574975s" podCreationTimestamp="2025-12-06 07:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:28.272052692 +0000 UTC m=+1223.085412081" watchObservedRunningTime="2025-12-06 07:17:28.28574975 +0000 UTC m=+1223.099109139" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.314681 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:28 crc kubenswrapper[4954]: I1206 07:17:28.982348 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"] Dec 06 07:17:28 crc kubenswrapper[4954]: W1206 07:17:28.989213 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bce81b_d290_4c3a_b5ce_e7188df23c4e.slice/crio-7786504d3ca4bb2102292658ebbe9baa3da0e1e0852eb7410d188b293873cb3a WatchSource:0}: Error finding container 7786504d3ca4bb2102292658ebbe9baa3da0e1e0852eb7410d188b293873cb3a: Status 404 returned error can't find the container with id 7786504d3ca4bb2102292658ebbe9baa3da0e1e0852eb7410d188b293873cb3a Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.047980 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.054079 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.057537 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.057760 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2wjhz" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.058021 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.058182 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.078015 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.169749 4954 generic.go:334] "Generic (PLEG): container finished" podID="0cfba967-3154-41c7-b541-45d9a8a2a122" containerID="5da64da480bd207602ab54760066366c1392ecfb9339d3d709fba33f063c4aac" exitCode=0 Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.169858 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w9lhq" event={"ID":"0cfba967-3154-41c7-b541-45d9a8a2a122","Type":"ContainerDied","Data":"5da64da480bd207602ab54760066366c1392ecfb9339d3d709fba33f063c4aac"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.189129 4954 generic.go:334] "Generic (PLEG): container finished" podID="372dbd7a-e026-4794-9f89-071af1d6e3a8" containerID="f9ce5a5580680603b5f5978b8290b06d0f73e55dfade2adf905db6a7508d152b" exitCode=0 Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.189315 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88ab-account-create-update-8grdl" event={"ID":"372dbd7a-e026-4794-9f89-071af1d6e3a8","Type":"ContainerDied","Data":"f9ce5a5580680603b5f5978b8290b06d0f73e55dfade2adf905db6a7508d152b"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.200017 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d905-account-create-update-tljn7" event={"ID":"94b5d766-a567-4328-9942-c0dc2eede97e","Type":"ContainerStarted","Data":"085463d09eb36984e71b25ab74286f6b9bc39b6985a17aff26c32c1c17ebc34b"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.207816 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-829jd" 
event={"ID":"9ca78e5f-9977-49fa-b175-504c48bf1861","Type":"ContainerDied","Data":"fa14aef8f5202091e49f53bf62c1c85cbe928b299e3ba4aafb049519f6b59f0a"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.207175 4954 generic.go:334] "Generic (PLEG): container finished" podID="9ca78e5f-9977-49fa-b175-504c48bf1861" containerID="fa14aef8f5202091e49f53bf62c1c85cbe928b299e3ba4aafb049519f6b59f0a" exitCode=0 Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.229179 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.229304 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69bl\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.229500 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.229713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.229758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.231993 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" event={"ID":"84bce81b-d290-4c3a-b5ce-e7188df23c4e","Type":"ContainerStarted","Data":"7786504d3ca4bb2102292658ebbe9baa3da0e1e0852eb7410d188b293873cb3a"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.237749 4954 generic.go:334] "Generic (PLEG): container finished" podID="054f854f-83bd-4174-bf86-da2b1a6cfedb" containerID="928615f26172b5cb755987526ac5bb1eb4f54e75a55d4d52248a3bd9c543805c" exitCode=0 Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.238109 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b67a-account-create-update-pbmk9" event={"ID":"054f854f-83bd-4174-bf86-da2b1a6cfedb","Type":"ContainerDied","Data":"928615f26172b5cb755987526ac5bb1eb4f54e75a55d4d52248a3bd9c543805c"} Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.238801 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="dnsmasq-dns" containerID="cri-o://22dc101badfdcc2e92af5b343a2700b78142682810bf26c49fdfc617c65a1d33" gracePeriod=10 Dec 06 07:17:29 crc kubenswrapper[4954]: 
I1206 07:17:29.243383 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d905-account-create-update-tljn7" podStartSLOduration=3.243366265 podStartE2EDuration="3.243366265s" podCreationTimestamp="2025-12-06 07:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:29.230281544 +0000 UTC m=+1224.043640943" watchObservedRunningTime="2025-12-06 07:17:29.243366265 +0000 UTC m=+1224.056725654" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.331327 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.331904 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.331935 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.332054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.332109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v69bl\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.333708 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.334164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.331633 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.335034 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.335095 4954 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:17:29.835071106 +0000 UTC m=+1224.648430685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.335870 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.369788 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69bl\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.379128 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.496808 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.640767 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts\") pod \"0cfba967-3154-41c7-b541-45d9a8a2a122\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.641109 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hmsq\" (UniqueName: \"kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq\") pod \"0cfba967-3154-41c7-b541-45d9a8a2a122\" (UID: \"0cfba967-3154-41c7-b541-45d9a8a2a122\") " Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.641447 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cfba967-3154-41c7-b541-45d9a8a2a122" (UID: "0cfba967-3154-41c7-b541-45d9a8a2a122"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.646014 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq" (OuterVolumeSpecName: "kube-api-access-8hmsq") pod "0cfba967-3154-41c7-b541-45d9a8a2a122" (UID: "0cfba967-3154-41c7-b541-45d9a8a2a122"). InnerVolumeSpecName "kube-api-access-8hmsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.693131 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xg877"] Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.693611 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfba967-3154-41c7-b541-45d9a8a2a122" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.693625 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfba967-3154-41c7-b541-45d9a8a2a122" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.693862 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfba967-3154-41c7-b541-45d9a8a2a122" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.694497 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.699158 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.699574 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.700457 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.727496 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xg877"] Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.743732 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hmsq\" (UniqueName: \"kubernetes.io/projected/0cfba967-3154-41c7-b541-45d9a8a2a122-kube-api-access-8hmsq\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.743780 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cfba967-3154-41c7-b541-45d9a8a2a122-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.751857 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m72rg" Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.760663 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-s22sj ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-xg877" podUID="8a7d2715-f057-4114-8e32-f56025d29bad" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.775079 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cddwk"] Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.775619 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3656e91b-4a6f-4ae8-a097-0f5961df5f25" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.775641 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3656e91b-4a6f-4ae8-a097-0f5961df5f25" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.775891 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3656e91b-4a6f-4ae8-a097-0f5961df5f25" containerName="mariadb-database-create" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.776659 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.783859 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xg877"] Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.796077 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cddwk"] Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.845608 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llvwj\" (UniqueName: \"kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj\") pod \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.845669 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts\") pod \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\" (UID: \"3656e91b-4a6f-4ae8-a097-0f5961df5f25\") " Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846234 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846268 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts\") 
pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846514 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3656e91b-4a6f-4ae8-a097-0f5961df5f25" (UID: "3656e91b-4a6f-4ae8-a097-0f5961df5f25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846779 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846952 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.846994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.847074 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.847160 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22sj\" (UniqueName: \"kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.847285 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.847308 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: E1206 07:17:29.847440 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:17:30.847406903 +0000 UTC m=+1225.660766472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.847450 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3656e91b-4a6f-4ae8-a097-0f5961df5f25-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.849287 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj" (OuterVolumeSpecName: "kube-api-access-llvwj") pod "3656e91b-4a6f-4ae8-a097-0f5961df5f25" (UID: "3656e91b-4a6f-4ae8-a097-0f5961df5f25"). InnerVolumeSpecName "kube-api-access-llvwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.949958 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.950818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951136 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951282 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22sj\" (UniqueName: \"kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951333 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951536 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951754 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951806 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.951857 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952127 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7rrd\" (UniqueName: \"kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952427 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952514 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llvwj\" (UniqueName: 
\"kubernetes.io/projected/3656e91b-4a6f-4ae8-a097-0f5961df5f25-kube-api-access-llvwj\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.952857 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.953314 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.956303 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.957040 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.957526 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:29 crc kubenswrapper[4954]: I1206 07:17:29.973466 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22sj\" (UniqueName: \"kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj\") pod \"swift-ring-rebalance-xg877\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.054445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055148 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055198 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: 
I1206 07:17:30.055226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7rrd\" (UniqueName: \"kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055524 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.055693 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.056254 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.056942 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.059546 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.059924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.060119 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.077246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7rrd\" (UniqueName: \"kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd\") pod \"swift-ring-rebalance-cddwk\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.098199 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.271551 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m72rg" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.271549 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m72rg" event={"ID":"3656e91b-4a6f-4ae8-a097-0f5961df5f25","Type":"ContainerDied","Data":"59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0"} Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.272324 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59398a43ca127f19e35502826cab10bccde142fdf04acc68e116f063534d43a0" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.278132 4954 generic.go:334] "Generic (PLEG): container finished" podID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerID="52fcf5af3db779d717e14fdc3a3c4f3d3aa22bcf9db40eb4664db0e461643a94" exitCode=0 Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.278211 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" event={"ID":"84bce81b-d290-4c3a-b5ce-e7188df23c4e","Type":"ContainerDied","Data":"52fcf5af3db779d717e14fdc3a3c4f3d3aa22bcf9db40eb4664db0e461643a94"} Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.289382 4954 generic.go:334] "Generic (PLEG): container finished" podID="46bc8cab-9415-4928-a4cf-715976f4119c" containerID="22dc101badfdcc2e92af5b343a2700b78142682810bf26c49fdfc617c65a1d33" exitCode=0 Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.289490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" event={"ID":"46bc8cab-9415-4928-a4cf-715976f4119c","Type":"ContainerDied","Data":"22dc101badfdcc2e92af5b343a2700b78142682810bf26c49fdfc617c65a1d33"} Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.295612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w9lhq" event={"ID":"0cfba967-3154-41c7-b541-45d9a8a2a122","Type":"ContainerDied","Data":"3ef35d22d752a5131fcbd95938b2263d4dcd17010bd8bcac20a17d3207e438ff"} Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.295679 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ef35d22d752a5131fcbd95938b2263d4dcd17010bd8bcac20a17d3207e438ff" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.295715 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w9lhq" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.295772 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.429808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.598867 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22sj\" (UniqueName: \"kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599153 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599287 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599322 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.599380 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift\") pod \"8a7d2715-f057-4114-8e32-f56025d29bad\" (UID: \"8a7d2715-f057-4114-8e32-f56025d29bad\") " Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.601710 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.602232 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.602949 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts" (OuterVolumeSpecName: "scripts") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.641771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.708180 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.708232 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a7d2715-f057-4114-8e32-f56025d29bad-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.708244 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.708255 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a7d2715-f057-4114-8e32-f56025d29bad-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.713863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.713938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.713985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj" (OuterVolumeSpecName: "kube-api-access-s22sj") pod "8a7d2715-f057-4114-8e32-f56025d29bad" (UID: "8a7d2715-f057-4114-8e32-f56025d29bad"). InnerVolumeSpecName "kube-api-access-s22sj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.810807 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.810848 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s22sj\" (UniqueName: \"kubernetes.io/projected/8a7d2715-f057-4114-8e32-f56025d29bad-kube-api-access-s22sj\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.810862 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a7d2715-f057-4114-8e32-f56025d29bad-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.816780 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.903116 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cddwk"] Dec 06 07:17:30 crc kubenswrapper[4954]: I1206 07:17:30.913066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:30 crc kubenswrapper[4954]: E1206 07:17:30.913386 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:30 crc kubenswrapper[4954]: E1206 07:17:30.913410 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:30 crc kubenswrapper[4954]: E1206 07:17:30.913485 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:17:32.913457207 +0000 UTC m=+1227.726816596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.009136 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.014638 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vhn9\" (UniqueName: \"kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9\") pod \"46bc8cab-9415-4928-a4cf-715976f4119c\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.014699 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc\") pod \"46bc8cab-9415-4928-a4cf-715976f4119c\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.014808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb\") pod \"46bc8cab-9415-4928-a4cf-715976f4119c\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.014885 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config\") pod \"46bc8cab-9415-4928-a4cf-715976f4119c\" (UID: \"46bc8cab-9415-4928-a4cf-715976f4119c\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.058545 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9" (OuterVolumeSpecName: "kube-api-access-5vhn9") pod "46bc8cab-9415-4928-a4cf-715976f4119c" (UID: "46bc8cab-9415-4928-a4cf-715976f4119c"). InnerVolumeSpecName "kube-api-access-5vhn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.103273 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config" (OuterVolumeSpecName: "config") pod "46bc8cab-9415-4928-a4cf-715976f4119c" (UID: "46bc8cab-9415-4928-a4cf-715976f4119c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.115194 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46bc8cab-9415-4928-a4cf-715976f4119c" (UID: "46bc8cab-9415-4928-a4cf-715976f4119c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.117217 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts\") pod \"372dbd7a-e026-4794-9f89-071af1d6e3a8\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.117350 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tsvx\" (UniqueName: \"kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx\") pod \"372dbd7a-e026-4794-9f89-071af1d6e3a8\" (UID: \"372dbd7a-e026-4794-9f89-071af1d6e3a8\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.117970 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "372dbd7a-e026-4794-9f89-071af1d6e3a8" (UID: "372dbd7a-e026-4794-9f89-071af1d6e3a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.118288 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vhn9\" (UniqueName: \"kubernetes.io/projected/46bc8cab-9415-4928-a4cf-715976f4119c-kube-api-access-5vhn9\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.118307 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.118317 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.118326 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372dbd7a-e026-4794-9f89-071af1d6e3a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.124046 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx" (OuterVolumeSpecName: "kube-api-access-7tsvx") pod "372dbd7a-e026-4794-9f89-071af1d6e3a8" (UID: "372dbd7a-e026-4794-9f89-071af1d6e3a8"). InnerVolumeSpecName "kube-api-access-7tsvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.139086 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46bc8cab-9415-4928-a4cf-715976f4119c" (UID: "46bc8cab-9415-4928-a4cf-715976f4119c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.144047 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.153196 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-829jd" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.220011 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tsvx\" (UniqueName: \"kubernetes.io/projected/372dbd7a-e026-4794-9f89-071af1d6e3a8-kube-api-access-7tsvx\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.220069 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46bc8cab-9415-4928-a4cf-715976f4119c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.314242 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cddwk" event={"ID":"c70d4759-1065-4412-9685-898f12a23a38","Type":"ContainerStarted","Data":"1f6cf125245081ff16a7dddba8e00cc53b3ac87a6a568f8a81edead9e70506df"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.317156 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-88ab-account-create-update-8grdl" event={"ID":"372dbd7a-e026-4794-9f89-071af1d6e3a8","Type":"ContainerDied","Data":"893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.317195 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893ef05c5ccb03fd244e6ae070c407a10c19e0a0c8fec7d2f9949f65e9ab87f1" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.317270 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-88ab-account-create-update-8grdl" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.319633 4954 generic.go:334] "Generic (PLEG): container finished" podID="94b5d766-a567-4328-9942-c0dc2eede97e" containerID="085463d09eb36984e71b25ab74286f6b9bc39b6985a17aff26c32c1c17ebc34b" exitCode=0 Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.319712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d905-account-create-update-tljn7" event={"ID":"94b5d766-a567-4328-9942-c0dc2eede97e","Type":"ContainerDied","Data":"085463d09eb36984e71b25ab74286f6b9bc39b6985a17aff26c32c1c17ebc34b"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.320546 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts\") pod \"054f854f-83bd-4174-bf86-da2b1a6cfedb\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.320672 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws7k6\" (UniqueName: \"kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6\") pod \"054f854f-83bd-4174-bf86-da2b1a6cfedb\" (UID: \"054f854f-83bd-4174-bf86-da2b1a6cfedb\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.320695 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28dfh\" (UniqueName: \"kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh\") pod \"9ca78e5f-9977-49fa-b175-504c48bf1861\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.320845 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts\") pod \"9ca78e5f-9977-49fa-b175-504c48bf1861\" (UID: \"9ca78e5f-9977-49fa-b175-504c48bf1861\") " Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.322274 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "054f854f-83bd-4174-bf86-da2b1a6cfedb" (UID: "054f854f-83bd-4174-bf86-da2b1a6cfedb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.322518 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ca78e5f-9977-49fa-b175-504c48bf1861" (UID: "9ca78e5f-9977-49fa-b175-504c48bf1861"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.324396 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-829jd" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.324386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-829jd" event={"ID":"9ca78e5f-9977-49fa-b175-504c48bf1861","Type":"ContainerDied","Data":"e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.324509 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1d4cd7e7ca6ecdb60f12dd8006c868e553a9216cd068f0f63c6987d70474c74" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.326784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" event={"ID":"84bce81b-d290-4c3a-b5ce-e7188df23c4e","Type":"ContainerStarted","Data":"9c8b3230b89e7cd383e6e20c8ead07828efba3da118a04036e69862d1033971b"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.327080 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.328182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh" (OuterVolumeSpecName: "kube-api-access-28dfh") pod "9ca78e5f-9977-49fa-b175-504c48bf1861" (UID: "9ca78e5f-9977-49fa-b175-504c48bf1861"). InnerVolumeSpecName "kube-api-access-28dfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.329095 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6" (OuterVolumeSpecName: "kube-api-access-ws7k6") pod "054f854f-83bd-4174-bf86-da2b1a6cfedb" (UID: "054f854f-83bd-4174-bf86-da2b1a6cfedb"). InnerVolumeSpecName "kube-api-access-ws7k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.331453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b67a-account-create-update-pbmk9" event={"ID":"054f854f-83bd-4174-bf86-da2b1a6cfedb","Type":"ContainerDied","Data":"14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.331481 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a02c51521868783b5485ed302cba018d0e8c56eafa1aba5cddf9eb5aed8815" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.331536 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b67a-account-create-update-pbmk9" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.341291 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xg877" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.342146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" event={"ID":"46bc8cab-9415-4928-a4cf-715976f4119c","Type":"ContainerDied","Data":"8b0b07b2880457c8afaec6b0a363c1437bf7ef331100c7c1562c00f373dd3770"} Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.342311 4954 scope.go:117] "RemoveContainer" containerID="22dc101badfdcc2e92af5b343a2700b78142682810bf26c49fdfc617c65a1d33" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.342239 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67bcdbf5-92x9s" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.383712 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" podStartSLOduration=4.383688225 podStartE2EDuration="4.383688225s" podCreationTimestamp="2025-12-06 07:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:17:31.377464298 +0000 UTC m=+1226.190823707" watchObservedRunningTime="2025-12-06 07:17:31.383688225 +0000 UTC m=+1226.197047624" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.387265 4954 scope.go:117] "RemoveContainer" containerID="d290b1fac613ad4d1e95398cdc516a96f9966a8a892928d7e64a4ad000e655d3" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.411125 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"] Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.422679 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c67bcdbf5-92x9s"] Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.422987 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca78e5f-9977-49fa-b175-504c48bf1861-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.423264 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054f854f-83bd-4174-bf86-da2b1a6cfedb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.423393 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28dfh\" (UniqueName: \"kubernetes.io/projected/9ca78e5f-9977-49fa-b175-504c48bf1861-kube-api-access-28dfh\") on node \"crc\" 
DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.423482 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws7k6\" (UniqueName: \"kubernetes.io/projected/054f854f-83bd-4174-bf86-da2b1a6cfedb-kube-api-access-ws7k6\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.462980 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" path="/var/lib/kubelet/pods/46bc8cab-9415-4928-a4cf-715976f4119c/volumes" Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.463904 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xg877"] Dec 06 07:17:31 crc kubenswrapper[4954]: I1206 07:17:31.468534 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-xg877"] Dec 06 07:17:32 crc kubenswrapper[4954]: I1206 07:17:32.954389 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:32 crc kubenswrapper[4954]: E1206 07:17:32.954694 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:32 crc kubenswrapper[4954]: E1206 07:17:32.955403 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:32 crc kubenswrapper[4954]: E1206 07:17:32.955481 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:17:36.955454339 +0000 UTC m=+1231.768813728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:33 crc kubenswrapper[4954]: I1206 07:17:33.457468 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a7d2715-f057-4114-8e32-f56025d29bad" path="/var/lib/kubelet/pods/8a7d2715-f057-4114-8e32-f56025d29bad/volumes" Dec 06 07:17:33 crc kubenswrapper[4954]: I1206 07:17:33.908443 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:33.999808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts\") pod \"94b5d766-a567-4328-9942-c0dc2eede97e\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.000052 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmdsx\" (UniqueName: \"kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx\") pod \"94b5d766-a567-4328-9942-c0dc2eede97e\" (UID: \"94b5d766-a567-4328-9942-c0dc2eede97e\") " Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.001019 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94b5d766-a567-4328-9942-c0dc2eede97e" (UID: "94b5d766-a567-4328-9942-c0dc2eede97e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.004739 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx" (OuterVolumeSpecName: "kube-api-access-jmdsx") pod "94b5d766-a567-4328-9942-c0dc2eede97e" (UID: "94b5d766-a567-4328-9942-c0dc2eede97e"). InnerVolumeSpecName "kube-api-access-jmdsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.102884 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmdsx\" (UniqueName: \"kubernetes.io/projected/94b5d766-a567-4328-9942-c0dc2eede97e-kube-api-access-jmdsx\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.102938 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b5d766-a567-4328-9942-c0dc2eede97e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.122079 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.375491 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cddwk" event={"ID":"c70d4759-1065-4412-9685-898f12a23a38","Type":"ContainerStarted","Data":"0bf16b7efb1f2617221d73830bc9e49e51a86fe1c47c1f391e18bcba68548179"} Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.379972 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d905-account-create-update-tljn7" event={"ID":"94b5d766-a567-4328-9942-c0dc2eede97e","Type":"ContainerDied","Data":"86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865"} Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.380102 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b128b3795044ba5e7c43ce2c278b0de07f80e6c18970180da9cec458db4865" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.380230 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d905-account-create-update-tljn7" Dec 06 07:17:34 crc kubenswrapper[4954]: I1206 07:17:34.408754 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cddwk" podStartSLOduration=2.4281930689999998 podStartE2EDuration="5.408734814s" podCreationTimestamp="2025-12-06 07:17:29 +0000 UTC" firstStartedPulling="2025-12-06 07:17:30.905220186 +0000 UTC m=+1225.718579565" lastFinishedPulling="2025-12-06 07:17:33.885761921 +0000 UTC m=+1228.699121310" observedRunningTime="2025-12-06 07:17:34.401059078 +0000 UTC m=+1229.214418487" watchObservedRunningTime="2025-12-06 07:17:34.408734814 +0000 UTC m=+1229.222094203" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.359336 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zttzq"] Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376370 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054f854f-83bd-4174-bf86-da2b1a6cfedb" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376434 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="054f854f-83bd-4174-bf86-da2b1a6cfedb" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376447 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="dnsmasq-dns" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376458 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="dnsmasq-dns" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376477 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="init" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376485 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="init" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376507 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372dbd7a-e026-4794-9f89-071af1d6e3a8" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376518 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="372dbd7a-e026-4794-9f89-071af1d6e3a8" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376554 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b5d766-a567-4328-9942-c0dc2eede97e" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376584 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b5d766-a567-4328-9942-c0dc2eede97e" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.376611 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca78e5f-9977-49fa-b175-504c48bf1861" containerName="mariadb-database-create" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.376620 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca78e5f-9977-49fa-b175-504c48bf1861" containerName="mariadb-database-create" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.377033 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="054f854f-83bd-4174-bf86-da2b1a6cfedb" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc 
kubenswrapper[4954]: I1206 07:17:36.377077 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca78e5f-9977-49fa-b175-504c48bf1861" containerName="mariadb-database-create" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.377111 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b5d766-a567-4328-9942-c0dc2eede97e" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.377133 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bc8cab-9415-4928-a4cf-715976f4119c" containerName="dnsmasq-dns" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.377156 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="372dbd7a-e026-4794-9f89-071af1d6e3a8" containerName="mariadb-account-create-update" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.378032 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zttzq"] Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.378283 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.382137 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.382598 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j5l2l" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.452363 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.452431 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.452554 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ffl4\" (UniqueName: \"kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.452620 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.554627 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.554722 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.554757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ffl4\" (UniqueName: \"kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.554809 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.563578 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.563596 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.565814 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.583102 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ffl4\" (UniqueName: \"kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4\") pod \"glance-db-sync-zttzq\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.704210 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zttzq" Dec 06 07:17:36 crc kubenswrapper[4954]: I1206 07:17:36.962814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.963158 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.963175 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:36 crc kubenswrapper[4954]: E1206 07:17:36.963237 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:17:44.963216847 +0000 UTC m=+1239.776576226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:37 crc kubenswrapper[4954]: I1206 07:17:37.293846 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zttzq"] Dec 06 07:17:37 crc kubenswrapper[4954]: I1206 07:17:37.411804 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zttzq" event={"ID":"b7854eba-8c30-4c39-9c96-36e1e3cf7437","Type":"ContainerStarted","Data":"9eec3a0567261ca112136f2be93f2caf286e7004c84c92aa76b6fe1e3d515b27"} Dec 06 07:17:38 crc kubenswrapper[4954]: I1206 07:17:38.317797 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:17:38 crc kubenswrapper[4954]: I1206 07:17:38.373335 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:38 crc kubenswrapper[4954]: I1206 07:17:38.373658 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="dnsmasq-dns" containerID="cri-o://56439ebbe0c086c44d87c285cf1e23c4b5e7abdacd33aacd55a69face6cdcb1c" gracePeriod=10 Dec 06 07:17:39 crc kubenswrapper[4954]: I1206 07:17:39.024864 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 07:17:39 crc kubenswrapper[4954]: I1206 07:17:39.120180 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Dec 06 07:17:39 crc kubenswrapper[4954]: I1206 07:17:39.435317 4954 generic.go:334] "Generic (PLEG): container finished" podID="d67c94fc-f109-4536-b490-59598ee00232" containerID="56439ebbe0c086c44d87c285cf1e23c4b5e7abdacd33aacd55a69face6cdcb1c" exitCode=0 Dec 06 07:17:39 crc kubenswrapper[4954]: I1206 07:17:39.436923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" 
event={"ID":"d67c94fc-f109-4536-b490-59598ee00232","Type":"ContainerDied","Data":"56439ebbe0c086c44d87c285cf1e23c4b5e7abdacd33aacd55a69face6cdcb1c"} Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.139405 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.233869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc\") pod \"d67c94fc-f109-4536-b490-59598ee00232\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.234022 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb\") pod \"d67c94fc-f109-4536-b490-59598ee00232\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.234249 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config\") pod \"d67c94fc-f109-4536-b490-59598ee00232\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.234291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkhw2\" (UniqueName: \"kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2\") pod \"d67c94fc-f109-4536-b490-59598ee00232\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.234394 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb\") pod \"d67c94fc-f109-4536-b490-59598ee00232\" (UID: \"d67c94fc-f109-4536-b490-59598ee00232\") " Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.243387 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2" (OuterVolumeSpecName: "kube-api-access-nkhw2") pod "d67c94fc-f109-4536-b490-59598ee00232" (UID: "d67c94fc-f109-4536-b490-59598ee00232"). InnerVolumeSpecName "kube-api-access-nkhw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.288783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d67c94fc-f109-4536-b490-59598ee00232" (UID: "d67c94fc-f109-4536-b490-59598ee00232"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.291236 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config" (OuterVolumeSpecName: "config") pod "d67c94fc-f109-4536-b490-59598ee00232" (UID: "d67c94fc-f109-4536-b490-59598ee00232"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.303269 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d67c94fc-f109-4536-b490-59598ee00232" (UID: "d67c94fc-f109-4536-b490-59598ee00232"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.308849 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d67c94fc-f109-4536-b490-59598ee00232" (UID: "d67c94fc-f109-4536-b490-59598ee00232"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.336755 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.336809 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.336826 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.336841 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkhw2\" (UniqueName: \"kubernetes.io/projected/d67c94fc-f109-4536-b490-59598ee00232-kube-api-access-nkhw2\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.336855 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d67c94fc-f109-4536-b490-59598ee00232-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.449293 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" event={"ID":"d67c94fc-f109-4536-b490-59598ee00232","Type":"ContainerDied","Data":"aa27b09f792da0fa724357a00850447a1cf434bec94238fdc218ed1290a47f4a"} Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.449403 4954 scope.go:117] "RemoveContainer" containerID="56439ebbe0c086c44d87c285cf1e23c4b5e7abdacd33aacd55a69face6cdcb1c" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.449680 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-984c76dd7-vkbsv" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.493654 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.503848 4954 scope.go:117] "RemoveContainer" containerID="7f199df227d670a85f4e663f49a89301006012cae58fd5b6e637804cb99896df" Dec 06 07:17:40 crc kubenswrapper[4954]: I1206 07:17:40.512391 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-984c76dd7-vkbsv"] Dec 06 07:17:41 crc kubenswrapper[4954]: I1206 07:17:41.462378 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67c94fc-f109-4536-b490-59598ee00232" path="/var/lib/kubelet/pods/d67c94fc-f109-4536-b490-59598ee00232/volumes" Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.063501 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:17:45 crc kubenswrapper[4954]: E1206 07:17:45.063970 4954 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 07:17:45 crc kubenswrapper[4954]: E1206 07:17:45.063987 4954 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 07:17:45 crc kubenswrapper[4954]: E1206 07:17:45.064037 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift podName:b304148c-ab0e-42ac-966a-024ff59a8cde nodeName:}" failed. No retries permitted until 2025-12-06 07:18:01.06401993 +0000 UTC m=+1255.877379319 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift") pod "swift-storage-0" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde") : configmap "swift-ring-files" not found Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.517244 4954 generic.go:334] "Generic (PLEG): container finished" podID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerID="b6518d5655feada488f616ac02858e05fce9b02722643230cbaae6694b695cac" exitCode=0 Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.517351 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerDied","Data":"b6518d5655feada488f616ac02858e05fce9b02722643230cbaae6694b695cac"} Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.519502 4954 generic.go:334] "Generic (PLEG): container finished" podID="578bec25-a54c-4f52-95f2-19f20f833437" containerID="0a70376c5fc5526ced20f1109a9cab2d6bb28881276a6118e1121122aa40551c" exitCode=0 Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.519543 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerDied","Data":"0a70376c5fc5526ced20f1109a9cab2d6bb28881276a6118e1121122aa40551c"} Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.523658 4954 generic.go:334] "Generic (PLEG): container finished" podID="c70d4759-1065-4412-9685-898f12a23a38" containerID="0bf16b7efb1f2617221d73830bc9e49e51a86fe1c47c1f391e18bcba68548179" exitCode=0 Dec 06 07:17:45 crc kubenswrapper[4954]: I1206 07:17:45.523712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cddwk" event={"ID":"c70d4759-1065-4412-9685-898f12a23a38","Type":"ContainerDied","Data":"0bf16b7efb1f2617221d73830bc9e49e51a86fe1c47c1f391e18bcba68548179"} Dec 06 07:17:46 crc kubenswrapper[4954]: I1206 07:17:46.677877 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lnbn8" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller" probeResult="failure" output=< Dec 06 07:17:46 crc kubenswrapper[4954]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 07:17:46 crc kubenswrapper[4954]: > Dec 06 07:17:51 crc kubenswrapper[4954]: I1206 07:17:51.687505 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lnbn8" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller" probeResult="failure" output=< Dec 06 07:17:51 crc kubenswrapper[4954]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 07:17:51 crc kubenswrapper[4954]: > Dec 06 07:17:51 crc kubenswrapper[4954]: I1206 07:17:51.693872 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:17:51 crc kubenswrapper[4954]: I1206 07:17:51.709148 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.023701 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lnbn8-config-tq87b"] Dec 06 07:17:52 crc kubenswrapper[4954]: E1206 07:17:52.024335 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="init" Dec 06 07:17:52 crc 
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.024367 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="init"
Dec 06 07:17:52 crc kubenswrapper[4954]: E1206 07:17:52.024415 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="dnsmasq-dns"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.024430 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="dnsmasq-dns"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.024700 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67c94fc-f109-4536-b490-59598ee00232" containerName="dnsmasq-dns"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.025654 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.029115 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.034066 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8-config-tq87b"]
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185100 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185661 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.185992 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8w2\" (UniqueName: \"kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288037 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288144 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8w2\" (UniqueName: \"kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288286 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.288310 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.290640 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.290972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.291420 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.291901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.291968 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.423770 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8w2\" (UniqueName: \"kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2\") pod \"ovn-controller-lnbn8-config-tq87b\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:52 crc kubenswrapper[4954]: I1206 07:17:52.664390 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-tq87b"
Dec 06 07:17:55 crc kubenswrapper[4954]: E1206 07:17:55.185805 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63"
Dec 06 07:17:55 crc kubenswrapper[4954]: E1206 07:17:55.186408 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ffl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-zttzq_openstack(b7854eba-8c30-4c39-9c96-36e1e3cf7437): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 07:17:55 crc kubenswrapper[4954]: E1206 07:17:55.187605 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-zttzq" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437"
Need to start a new one" pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.367996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369667 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369766 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7rrd\" (UniqueName: \"kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369821 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369851 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369882 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.369947 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts\") pod \"c70d4759-1065-4412-9685-898f12a23a38\" (UID: \"c70d4759-1065-4412-9685-898f12a23a38\") " Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.371444 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.372692 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.377443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd" (OuterVolumeSpecName: "kube-api-access-x7rrd") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "kube-api-access-x7rrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.383747 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.401061 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts" (OuterVolumeSpecName: "scripts") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.401718 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.416238 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c70d4759-1065-4412-9685-898f12a23a38" (UID: "c70d4759-1065-4412-9685-898f12a23a38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473183 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c70d4759-1065-4412-9685-898f12a23a38-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473677 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7rrd\" (UniqueName: \"kubernetes.io/projected/c70d4759-1065-4412-9685-898f12a23a38-kube-api-access-x7rrd\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473707 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473744 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473761 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473774 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c70d4759-1065-4412-9685-898f12a23a38-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.473786 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c70d4759-1065-4412-9685-898f12a23a38-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.533543 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8-config-tq87b"] Dec 06 07:17:55 crc kubenswrapper[4954]: W1206 07:17:55.538868 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79238c2f_7f07_4d70_9fd7_c5f238fe32d7.slice/crio-bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba WatchSource:0}: Error finding container bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba: Status 404 returned error can't find the container with id bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.720403 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cddwk" event={"ID":"c70d4759-1065-4412-9685-898f12a23a38","Type":"ContainerDied","Data":"1f6cf125245081ff16a7dddba8e00cc53b3ac87a6a568f8a81edead9e70506df"} Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.720962 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6cf125245081ff16a7dddba8e00cc53b3ac87a6a568f8a81edead9e70506df" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.720451 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cddwk" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.725272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-tq87b" event={"ID":"79238c2f-7f07-4d70-9fd7-c5f238fe32d7","Type":"ContainerStarted","Data":"bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba"} Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.731402 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerStarted","Data":"146293bb351f1e02c74e1849e3be8838c1377922574653baaa3053a66ff3aad5"} Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.731849 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.737108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerStarted","Data":"cf94bc0a088f861f78bc849ba8a3d2602f00e20f5f2438d40190effa61b16e55"} Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.737448 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 07:17:55 crc kubenswrapper[4954]: E1206 07:17:55.739501 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-zttzq" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.775791 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.010381563 podStartE2EDuration="1m14.775766062s" podCreationTimestamp="2025-12-06 07:16:41 +0000 UTC" firstStartedPulling="2025-12-06 07:16:43.558401209 +0000 UTC m=+1178.371760598" lastFinishedPulling="2025-12-06 07:17:10.323785668 +0000 UTC m=+1205.137145097" observedRunningTime="2025-12-06 07:17:55.76674287 +0000 UTC m=+1250.580102289" watchObservedRunningTime="2025-12-06 07:17:55.775766062 +0000 UTC m=+1250.589125451" Dec 06 07:17:55 crc kubenswrapper[4954]: I1206 07:17:55.836707 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.614304076 podStartE2EDuration="1m15.836666207s" podCreationTimestamp="2025-12-06 07:16:40 +0000 UTC" firstStartedPulling="2025-12-06 07:16:43.034034719 +0000 UTC m=+1177.847394108" lastFinishedPulling="2025-12-06 07:17:10.25639684 +0000 UTC m=+1205.069756239" observedRunningTime="2025-12-06 07:17:55.823539324 +0000 UTC m=+1250.636898723" watchObservedRunningTime="2025-12-06 07:17:55.836666207 +0000 UTC m=+1250.650025606" Dec 06 07:17:56 crc kubenswrapper[4954]: I1206 07:17:56.672190 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lnbn8" Dec 06 07:17:56 crc kubenswrapper[4954]: I1206 07:17:56.748690 4954 generic.go:334] "Generic (PLEG): container finished" podID="79238c2f-7f07-4d70-9fd7-c5f238fe32d7" containerID="16c047b2e6afd9ce52ca0f1e8a2b7459d989ea4b4352f8df7247e410e814cf59" exitCode=0 Dec 06 07:17:56 crc kubenswrapper[4954]: I1206 07:17:56.748755 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-tq87b" event={"ID":"79238c2f-7f07-4d70-9fd7-c5f238fe32d7","Type":"ContainerDied","Data":"16c047b2e6afd9ce52ca0f1e8a2b7459d989ea4b4352f8df7247e410e814cf59"} Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.146063 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-tq87b" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229287 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229424 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229416 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229506 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229451 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run" (OuterVolumeSpecName: "var-run") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q8w2\" (UniqueName: \"kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229739 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts\") pod \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\" (UID: \"79238c2f-7f07-4d70-9fd7-c5f238fe32d7\") " Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.229908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.230238 4954 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.230255 4954 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.230265 4954 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.230583 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.230968 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts" (OuterVolumeSpecName: "scripts") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.251515 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2" (OuterVolumeSpecName: "kube-api-access-4q8w2") pod "79238c2f-7f07-4d70-9fd7-c5f238fe32d7" (UID: "79238c2f-7f07-4d70-9fd7-c5f238fe32d7"). InnerVolumeSpecName "kube-api-access-4q8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.332049 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.332102 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q8w2\" (UniqueName: \"kubernetes.io/projected/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-kube-api-access-4q8w2\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.332117 4954 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79238c2f-7f07-4d70-9fd7-c5f238fe32d7-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.776578 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-tq87b" event={"ID":"79238c2f-7f07-4d70-9fd7-c5f238fe32d7","Type":"ContainerDied","Data":"bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba"} Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.776622 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc45b2d118cf3d2f95198d90f1836a086dbce593b8d0d27f0e1f17b79e889bba" Dec 06 07:17:58 crc kubenswrapper[4954]: I1206 07:17:58.776817 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-tq87b" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.286981 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lnbn8-config-tq87b"] Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.298838 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lnbn8-config-tq87b"] Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.456181 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79238c2f-7f07-4d70-9fd7-c5f238fe32d7" path="/var/lib/kubelet/pods/79238c2f-7f07-4d70-9fd7-c5f238fe32d7/volumes" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.505132 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lnbn8-config-4dglw"] Dec 06 07:17:59 crc kubenswrapper[4954]: E1206 07:17:59.505666 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79238c2f-7f07-4d70-9fd7-c5f238fe32d7" containerName="ovn-config" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.505692 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="79238c2f-7f07-4d70-9fd7-c5f238fe32d7" containerName="ovn-config" Dec 06 07:17:59 crc kubenswrapper[4954]: E1206 07:17:59.505712 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70d4759-1065-4412-9685-898f12a23a38" containerName="swift-ring-rebalance" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.505722 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70d4759-1065-4412-9685-898f12a23a38" containerName="swift-ring-rebalance" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.505956 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="79238c2f-7f07-4d70-9fd7-c5f238fe32d7" containerName="ovn-config" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.505983 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70d4759-1065-4412-9685-898f12a23a38" containerName="swift-ring-rebalance" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.506746 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.511175 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.523062 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8-config-4dglw"] Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.555801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.555871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.556444 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.556644 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4zhw\" (UniqueName: \"kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.556912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.557107 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659478 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659598 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4zhw\" (UniqueName: 
\"kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659644 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659672 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659735 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.659770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.660026 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.660046 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.660197 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.660828 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.662248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.684633 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4zhw\" (UniqueName: \"kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw\") pod \"ovn-controller-lnbn8-config-4dglw\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:17:59 crc kubenswrapper[4954]: I1206 07:17:59.828155 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:18:00 crc kubenswrapper[4954]: I1206 07:18:00.311214 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lnbn8-config-4dglw"] Dec 06 07:18:00 crc kubenswrapper[4954]: W1206 07:18:00.320146 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70d14ea1_37a2_41c0_b561_792db6cd4aac.slice/crio-81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5 WatchSource:0}: Error finding container 81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5: Status 404 returned error can't find the container with id 81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5 Dec 06 07:18:00 crc kubenswrapper[4954]: I1206 07:18:00.819768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-4dglw" event={"ID":"70d14ea1-37a2-41c0-b561-792db6cd4aac","Type":"ContainerStarted","Data":"76b9762d25a0f68e1ca9721e3f881fc79744180682ca9bbe031f0e0466f604df"} Dec 06 07:18:00 crc kubenswrapper[4954]: I1206 07:18:00.820288 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-4dglw" event={"ID":"70d14ea1-37a2-41c0-b561-792db6cd4aac","Type":"ContainerStarted","Data":"81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5"} Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.090342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.100625 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"swift-storage-0\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " pod="openstack/swift-storage-0" Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.345615 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.830893 4954 generic.go:334] "Generic (PLEG): container finished" podID="70d14ea1-37a2-41c0-b561-792db6cd4aac" containerID="76b9762d25a0f68e1ca9721e3f881fc79744180682ca9bbe031f0e0466f604df" exitCode=0 Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.831004 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-4dglw" event={"ID":"70d14ea1-37a2-41c0-b561-792db6cd4aac","Type":"ContainerDied","Data":"76b9762d25a0f68e1ca9721e3f881fc79744180682ca9bbe031f0e0466f604df"} Dec 06 07:18:01 crc kubenswrapper[4954]: I1206 07:18:01.924656 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 07:18:01 crc kubenswrapper[4954]: W1206 07:18:01.935675 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb304148c_ab0e_42ac_966a_024ff59a8cde.slice/crio-06a142b099c41eaf5928932556cdf7352557bc53eba3e9a9bf04b502832ff6dc WatchSource:0}: Error finding container 06a142b099c41eaf5928932556cdf7352557bc53eba3e9a9bf04b502832ff6dc: Status 404 returned error can't find the container with id 06a142b099c41eaf5928932556cdf7352557bc53eba3e9a9bf04b502832ff6dc Dec 06 07:18:02 crc kubenswrapper[4954]: I1206 07:18:02.842388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"06a142b099c41eaf5928932556cdf7352557bc53eba3e9a9bf04b502832ff6dc"} Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.635979 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.756879 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.756967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757039 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757191 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757270 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4zhw\" (UniqueName: \"kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: 
\"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757299 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn\") pod \"70d14ea1-37a2-41c0-b561-792db6cd4aac\" (UID: \"70d14ea1-37a2-41c0-b561-792db6cd4aac\") " Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757691 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.757879 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.758270 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.758343 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run" (OuterVolumeSpecName: "var-run") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.758604 4954 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.758723 4954 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.758626 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts" (OuterVolumeSpecName: "scripts") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.765059 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw" (OuterVolumeSpecName: "kube-api-access-k4zhw") pod "70d14ea1-37a2-41c0-b561-792db6cd4aac" (UID: "70d14ea1-37a2-41c0-b561-792db6cd4aac"). InnerVolumeSpecName "kube-api-access-k4zhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.854106 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8-config-4dglw" event={"ID":"70d14ea1-37a2-41c0-b561-792db6cd4aac","Type":"ContainerDied","Data":"81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5"} Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.854172 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ad01dfcc29fbf04880b396babf0155babaa99cf722324990f93d50dc1798b5" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.854209 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8-config-4dglw" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.857006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3"} Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.860698 4954 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70d14ea1-37a2-41c0-b561-792db6cd4aac-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.860758 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.860772 4954 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70d14ea1-37a2-41c0-b561-792db6cd4aac-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:03 crc kubenswrapper[4954]: I1206 07:18:03.860784 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4zhw\" (UniqueName: \"kubernetes.io/projected/70d14ea1-37a2-41c0-b561-792db6cd4aac-kube-api-access-k4zhw\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:04 crc kubenswrapper[4954]: I1206 07:18:04.745778 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lnbn8-config-4dglw"] Dec 06 07:18:04 crc kubenswrapper[4954]: I1206 07:18:04.756398 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lnbn8-config-4dglw"] Dec 06 07:18:04 crc kubenswrapper[4954]: I1206 07:18:04.871262 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"7d0ab20184ee34faada856909d1b2e18f0f407fa1b19e33fe65f5f582f55f86c"} Dec 06 07:18:04 crc kubenswrapper[4954]: I1206 07:18:04.871358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"1bb5213e7cc1c7b665991c2e686f0fcd95258f340b82e006fe07376d1ac56ed8"} Dec 06 07:18:05 crc kubenswrapper[4954]: I1206 07:18:05.456180 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d14ea1-37a2-41c0-b561-792db6cd4aac" path="/var/lib/kubelet/pods/70d14ea1-37a2-41c0-b561-792db6cd4aac/volumes" Dec 06 07:18:05 crc kubenswrapper[4954]: I1206 07:18:05.886587 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"17fe025c33d04ce12223561653e1cb4e9236655eaf9cd02886b52df15dccb969"} Dec 06 07:18:10 crc kubenswrapper[4954]: I1206 07:18:10.101497 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:18:10 crc kubenswrapper[4954]: I1206 07:18:10.102150 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:18:11 crc kubenswrapper[4954]: I1206 07:18:11.965258 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"854d5d618d13ec03ec6c2f4744287e27f89042a1dae6f369b12bbc3408f4ff5c"} Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.292859 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.738958 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xjfwm"] Dec 06 07:18:12 crc kubenswrapper[4954]: E1206 07:18:12.747776 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d14ea1-37a2-41c0-b561-792db6cd4aac" containerName="ovn-config" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.747837 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d14ea1-37a2-41c0-b561-792db6cd4aac" containerName="ovn-config" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.748304 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d14ea1-37a2-41c0-b561-792db6cd4aac" containerName="ovn-config" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.749132 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-df0e-account-create-update-nllgl"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.750255 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.750549 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.757833 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.767196 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xjfwm"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.809250 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-df0e-account-create-update-nllgl"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.887064 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.887142 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.887235 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhxd\" (UniqueName: \"kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.887323 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm688\" (UniqueName: \"kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.895196 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-s7jcc"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.896911 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s7jcc" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.930329 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-664f-account-create-update-74gq9"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.931715 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-664f-account-create-update-74gq9" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.938586 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.950404 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s7jcc"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.963660 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-664f-account-create-update-74gq9"] Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988450 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"3a5cc68bf70f6cc371a8cb6cffab174f354261b971b711b47d5d116af25c682e"} Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988511 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"ec4a2a2f803658eaa7ba7c7d378bed85b1093f0a8c4791ca3b80cb2214e6e449"} Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988523 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"bb1952930feb993e59b60f78d883b948aa239b6e485225832fb60b3345763360"} Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988662 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm688\" (UniqueName: \"kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxlg\" (UniqueName: \"kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988841 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988867 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.988928 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 
07:18:12.988978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhxd\" (UniqueName: \"kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.990812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.991497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:12 crc kubenswrapper[4954]: I1206 07:18:12.992254 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zttzq" event={"ID":"b7854eba-8c30-4c39-9c96-36e1e3cf7437","Type":"ContainerStarted","Data":"3018d3d8b3cb5a9e3856801dbdf5ff7b95038c330f1d30888a44744a692d61e1"} Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.015672 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm688\" (UniqueName: \"kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688\") pod \"cinder-db-create-xjfwm\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") " pod="openstack/cinder-db-create-xjfwm" Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.020258 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zttzq" podStartSLOduration=2.631213293 podStartE2EDuration="37.020230042s" podCreationTimestamp="2025-12-06 07:17:36 +0000 UTC" firstStartedPulling="2025-12-06 07:17:37.297169388 +0000 UTC m=+1232.110528777" lastFinishedPulling="2025-12-06 07:18:11.686186137 +0000 UTC m=+1266.499545526" observedRunningTime="2025-12-06 07:18:13.01679795 +0000 UTC m=+1267.830157339" watchObservedRunningTime="2025-12-06 07:18:13.020230042 +0000 UTC m=+1267.833589431" Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.027884 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.031545 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhxd\" (UniqueName: \"kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd\") pod \"cinder-df0e-account-create-update-nllgl\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") " pod="openstack/cinder-df0e-account-create-update-nllgl" Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.072685 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4vvr4"] Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.074497 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vvr4" Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.083164 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.084847 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4vvr4"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.085087 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.085270 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.085441 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.085540 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8jst"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.092761 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxlg\" (UniqueName: \"kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.092841 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.092909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.092986 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jkn\" (UniqueName: \"kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.097741 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xjfwm"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.104423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.151103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxlg\" (UniqueName: \"kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg\") pod \"barbican-db-create-s7jcc\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") " pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.195170 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jkn\" (UniqueName: \"kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.195291 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.195348 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v427b\" (UniqueName: \"kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.195518 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.195549 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.198536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.219691 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-498rf"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.221109 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.236147 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.250664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jkn\" (UniqueName: \"kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn\") pod \"barbican-664f-account-create-update-74gq9\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") " pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.259633 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-251d-account-create-update-d7mbv"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.260262 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.261295 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.265137 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.298635 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v427b\" (UniqueName: \"kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.299174 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.302890 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.303060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm4t5\" (UniqueName: \"kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.303301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.310241 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.313457 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.318467 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-498rf"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.369797 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-251d-account-create-update-d7mbv"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.370133 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v427b\" (UniqueName: \"kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b\") pod \"keystone-db-sync-4vvr4\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.411184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg5lc\" (UniqueName: \"kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.411313 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.411387 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.411477 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm4t5\" (UniqueName: \"kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.425621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.446124 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm4t5\" (UniqueName: \"kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5\") pod \"neutron-db-create-498rf\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") " pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.514023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg5lc\" (UniqueName: \"kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.514188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.516845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.540474 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg5lc\" (UniqueName: \"kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc\") pod \"neutron-251d-account-create-update-d7mbv\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") " pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.569276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vvr4"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.588673 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.612671 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.883698 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-df0e-account-create-update-nllgl"]
Dec 06 07:18:13 crc kubenswrapper[4954]: I1206 07:18:13.911501 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xjfwm"]
Dec 06 07:18:13 crc kubenswrapper[4954]: W1206 07:18:13.948743 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ece29cd_d385_457d_853d_f37b09007d05.slice/crio-a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23 WatchSource:0}: Error finding container a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23: Status 404 returned error can't find the container with id a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.017857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjfwm" event={"ID":"3ece29cd-d385-457d-853d-f37b09007d05","Type":"ContainerStarted","Data":"a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23"}
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.034762 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-df0e-account-create-update-nllgl" event={"ID":"b279d2ed-3be6-4f9e-b27d-8fac529b4d32","Type":"ContainerStarted","Data":"a8894ceca430a4751d4aaabba40c75acaae25e68bfb85cfff3b34c2cf11f9429"}
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.069774 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s7jcc"]
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.086559 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-664f-account-create-update-74gq9"]
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.343210 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-498rf"]
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.368409 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4vvr4"]
Dec 06 07:18:14 crc kubenswrapper[4954]: I1206 07:18:14.446802 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-251d-account-create-update-d7mbv"]
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.065999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-251d-account-create-update-d7mbv" event={"ID":"5aec320e-8fda-4ac0-aac9-85fa4a436c01","Type":"ContainerStarted","Data":"c7de933f42b4564aa075a10179a1d94a91f67970c4d423193b607d2cc86ca8c8"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.086044 4954 generic.go:334] "Generic (PLEG): container finished" podID="3ece29cd-d385-457d-853d-f37b09007d05" containerID="70ef882091070bde0bec956facf971437b651cffaa48697c11f169c6fe3617c5" exitCode=0
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.086226 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjfwm" event={"ID":"3ece29cd-d385-457d-853d-f37b09007d05","Type":"ContainerDied","Data":"70ef882091070bde0bec956facf971437b651cffaa48697c11f169c6fe3617c5"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.098716 4954 generic.go:334] "Generic (PLEG): container finished" podID="b279d2ed-3be6-4f9e-b27d-8fac529b4d32" containerID="c9cb2b326ffd728ac824c05ed72c22a0ac49dd17dd685e63e0cde489bf63efb1" exitCode=0
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.098833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-df0e-account-create-update-nllgl" event={"ID":"b279d2ed-3be6-4f9e-b27d-8fac529b4d32","Type":"ContainerDied","Data":"c9cb2b326ffd728ac824c05ed72c22a0ac49dd17dd685e63e0cde489bf63efb1"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.101198 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vvr4" event={"ID":"4374b249-a0ad-4b36-9924-fa871e0e63fa","Type":"ContainerStarted","Data":"525c0bd9311c30608a94e455c3c82e7a4a8011b1980e5a2a00a9b73bafbd10ba"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.105482 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-664f-account-create-update-74gq9" event={"ID":"45760345-6d26-4a9b-931e-1c4d5cc3ca3e","Type":"ContainerStarted","Data":"b3076b78bb2b4ab221c240edc55c9c81b87b5d2600470051867e57e44b888433"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.110146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-498rf" event={"ID":"a33fd05b-5e69-4042-a5b2-b07dd4edd63d","Type":"ContainerStarted","Data":"b8ce590d575034daf3c7b8fa95d4606b451244d536886796633d7e74fcde3e39"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.121623 4954 generic.go:334] "Generic (PLEG): container finished" podID="c4546452-6f33-4f15-9b6e-30aecc6e81d1" containerID="2be43c7ef80f1e974ba1bc9f1ee60912104a84a0dc25733164e3f37e1c0be14a" exitCode=0
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.121686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s7jcc" event={"ID":"c4546452-6f33-4f15-9b6e-30aecc6e81d1","Type":"ContainerDied","Data":"2be43c7ef80f1e974ba1bc9f1ee60912104a84a0dc25733164e3f37e1c0be14a"}
Dec 06 07:18:15 crc kubenswrapper[4954]: I1206 07:18:15.121723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s7jcc" event={"ID":"c4546452-6f33-4f15-9b6e-30aecc6e81d1","Type":"ContainerStarted","Data":"dab971cc0f7e7d593f6abc0fd9c0af34736d72eca929b03ef4a1be5da0f99e39"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.131974 4954 generic.go:334] "Generic (PLEG): container finished" podID="45760345-6d26-4a9b-931e-1c4d5cc3ca3e" containerID="7871e18bb8292414c78e7334c458b2779277880ca11659a8561bd1664ae80f7d" exitCode=0
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.132072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-664f-account-create-update-74gq9" event={"ID":"45760345-6d26-4a9b-931e-1c4d5cc3ca3e","Type":"ContainerDied","Data":"7871e18bb8292414c78e7334c458b2779277880ca11659a8561bd1664ae80f7d"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.135794 4954 generic.go:334] "Generic (PLEG): container finished" podID="a33fd05b-5e69-4042-a5b2-b07dd4edd63d" containerID="549dfa3cfa0a8efb669e86e3de71c2ea1577bb4859050d03b3d41b11fa4dae74" exitCode=0
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.135834 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-498rf" event={"ID":"a33fd05b-5e69-4042-a5b2-b07dd4edd63d","Type":"ContainerDied","Data":"549dfa3cfa0a8efb669e86e3de71c2ea1577bb4859050d03b3d41b11fa4dae74"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.139855 4954 generic.go:334] "Generic (PLEG): container finished" podID="5aec320e-8fda-4ac0-aac9-85fa4a436c01" containerID="26f1310e589079d2c29a84c9a9c823b2acb1c438f20660e7aedabf9a59ec76dd" exitCode=0
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.139966 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-251d-account-create-update-d7mbv" event={"ID":"5aec320e-8fda-4ac0-aac9-85fa4a436c01","Type":"ContainerDied","Data":"26f1310e589079d2c29a84c9a9c823b2acb1c438f20660e7aedabf9a59ec76dd"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.146814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"9e6c70c416cc446ced1c2039d0651b51abc7bb3be2892db83aa00306a5d00bb3"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.146870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"fd7f3929fff74fd65f6989375596655e7c51806663d983bd2aad57effe232114"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.146881 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002"}
Dec 06 07:18:16 crc kubenswrapper[4954]: I1206 07:18:16.938720 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xjfwm"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.025011 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts\") pod \"3ece29cd-d385-457d-853d-f37b09007d05\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.025076 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm688\" (UniqueName: \"kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688\") pod \"3ece29cd-d385-457d-853d-f37b09007d05\" (UID: \"3ece29cd-d385-457d-853d-f37b09007d05\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.026031 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ece29cd-d385-457d-853d-f37b09007d05" (UID: "3ece29cd-d385-457d-853d-f37b09007d05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.033715 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688" (OuterVolumeSpecName: "kube-api-access-dm688") pod "3ece29cd-d385-457d-853d-f37b09007d05" (UID: "3ece29cd-d385-457d-853d-f37b09007d05"). InnerVolumeSpecName "kube-api-access-dm688". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.069526 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.092681 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-df0e-account-create-update-nllgl"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.126417 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxlg\" (UniqueName: \"kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg\") pod \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.126529 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbhxd\" (UniqueName: \"kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd\") pod \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.126670 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts\") pod \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\" (UID: \"c4546452-6f33-4f15-9b6e-30aecc6e81d1\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.126858 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts\") pod \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\" (UID: \"b279d2ed-3be6-4f9e-b27d-8fac529b4d32\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.127445 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ece29cd-d385-457d-853d-f37b09007d05-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.127472 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm688\" (UniqueName: \"kubernetes.io/projected/3ece29cd-d385-457d-853d-f37b09007d05-kube-api-access-dm688\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.129021 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4546452-6f33-4f15-9b6e-30aecc6e81d1" (UID: "c4546452-6f33-4f15-9b6e-30aecc6e81d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.129221 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b279d2ed-3be6-4f9e-b27d-8fac529b4d32" (UID: "b279d2ed-3be6-4f9e-b27d-8fac529b4d32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.131811 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg" (OuterVolumeSpecName: "kube-api-access-xxxlg") pod "c4546452-6f33-4f15-9b6e-30aecc6e81d1" (UID: "c4546452-6f33-4f15-9b6e-30aecc6e81d1"). InnerVolumeSpecName "kube-api-access-xxxlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.135421 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd" (OuterVolumeSpecName: "kube-api-access-rbhxd") pod "b279d2ed-3be6-4f9e-b27d-8fac529b4d32" (UID: "b279d2ed-3be6-4f9e-b27d-8fac529b4d32"). InnerVolumeSpecName "kube-api-access-rbhxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.174595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s7jcc" event={"ID":"c4546452-6f33-4f15-9b6e-30aecc6e81d1","Type":"ContainerDied","Data":"dab971cc0f7e7d593f6abc0fd9c0af34736d72eca929b03ef4a1be5da0f99e39"}
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.174664 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab971cc0f7e7d593f6abc0fd9c0af34736d72eca929b03ef4a1be5da0f99e39"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.174782 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s7jcc"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.194835 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"85237b97891e4838aee5911e5b516a52841339958665c7a83d1d2d7121799c5a"}
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.194906 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"3e090d4172cdef300aa94f8d8fba36deddc5d8017b13b15dfe18162d7c6aa8ef"}
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.198806 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjfwm" event={"ID":"3ece29cd-d385-457d-853d-f37b09007d05","Type":"ContainerDied","Data":"a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23"}
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.198843 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a41d59c9fc9ae2264e4d5335d1a6c6adee40c333390fdf67427c6cb8b5f06e23"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.198885 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xjfwm"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.221112 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-df0e-account-create-update-nllgl"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.221179 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-df0e-account-create-update-nllgl" event={"ID":"b279d2ed-3be6-4f9e-b27d-8fac529b4d32","Type":"ContainerDied","Data":"a8894ceca430a4751d4aaabba40c75acaae25e68bfb85cfff3b34c2cf11f9429"}
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.221218 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8894ceca430a4751d4aaabba40c75acaae25e68bfb85cfff3b34c2cf11f9429"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.229127 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4546452-6f33-4f15-9b6e-30aecc6e81d1-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.229169 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.229180 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxlg\" (UniqueName: \"kubernetes.io/projected/c4546452-6f33-4f15-9b6e-30aecc6e81d1-kube-api-access-xxxlg\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.229194 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbhxd\" (UniqueName: \"kubernetes.io/projected/b279d2ed-3be6-4f9e-b27d-8fac529b4d32-kube-api-access-rbhxd\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.591002 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-498rf"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.637880 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm4t5\" (UniqueName: \"kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5\") pod \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.637987 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts\") pod \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\" (UID: \"a33fd05b-5e69-4042-a5b2-b07dd4edd63d\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.638688 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a33fd05b-5e69-4042-a5b2-b07dd4edd63d" (UID: "a33fd05b-5e69-4042-a5b2-b07dd4edd63d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.646001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5" (OuterVolumeSpecName: "kube-api-access-xm4t5") pod "a33fd05b-5e69-4042-a5b2-b07dd4edd63d" (UID: "a33fd05b-5e69-4042-a5b2-b07dd4edd63d"). InnerVolumeSpecName "kube-api-access-xm4t5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.703201 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-664f-account-create-update-74gq9"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.712547 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-251d-account-create-update-d7mbv"
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.738959 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49jkn\" (UniqueName: \"kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn\") pod \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.739077 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg5lc\" (UniqueName: \"kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc\") pod \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.739234 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts\") pod \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\" (UID: \"5aec320e-8fda-4ac0-aac9-85fa4a436c01\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.739437 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts\") pod \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\" (UID: \"45760345-6d26-4a9b-931e-1c4d5cc3ca3e\") "
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.739973 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm4t5\" (UniqueName: \"kubernetes.io/projected/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-kube-api-access-xm4t5\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.740003 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a33fd05b-5e69-4042-a5b2-b07dd4edd63d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.740710 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45760345-6d26-4a9b-931e-1c4d5cc3ca3e" (UID: "45760345-6d26-4a9b-931e-1c4d5cc3ca3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.741325 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5aec320e-8fda-4ac0-aac9-85fa4a436c01" (UID: "5aec320e-8fda-4ac0-aac9-85fa4a436c01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.745427 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn" (OuterVolumeSpecName: "kube-api-access-49jkn") pod "45760345-6d26-4a9b-931e-1c4d5cc3ca3e" (UID: "45760345-6d26-4a9b-931e-1c4d5cc3ca3e"). InnerVolumeSpecName "kube-api-access-49jkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.745847 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc" (OuterVolumeSpecName: "kube-api-access-wg5lc") pod "5aec320e-8fda-4ac0-aac9-85fa4a436c01" (UID: "5aec320e-8fda-4ac0-aac9-85fa4a436c01"). InnerVolumeSpecName "kube-api-access-wg5lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.842149 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.842191 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49jkn\" (UniqueName: \"kubernetes.io/projected/45760345-6d26-4a9b-931e-1c4d5cc3ca3e-kube-api-access-49jkn\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.842202 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg5lc\" (UniqueName: \"kubernetes.io/projected/5aec320e-8fda-4ac0-aac9-85fa4a436c01-kube-api-access-wg5lc\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:17 crc kubenswrapper[4954]: I1206 07:18:17.842211 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aec320e-8fda-4ac0-aac9-85fa4a436c01-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.235665 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-251d-account-create-update-d7mbv" event={"ID":"5aec320e-8fda-4ac0-aac9-85fa4a436c01","Type":"ContainerDied","Data":"c7de933f42b4564aa075a10179a1d94a91f67970c4d423193b607d2cc86ca8c8"} Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.235714 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7de933f42b4564aa075a10179a1d94a91f67970c4d423193b607d2cc86ca8c8" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.235758 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-251d-account-create-update-d7mbv" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.245366 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"aa40c5ec6d752ec2fbe2321f2103def28820a11e0c19b5134c7eb5bd9a35558b"} Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.245466 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerStarted","Data":"60ca35ac3122f717dfae9adfc952b010f5c490d1fbab75b0ecd0e2061e89d87e"} Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.249225 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-664f-account-create-update-74gq9" event={"ID":"45760345-6d26-4a9b-931e-1c4d5cc3ca3e","Type":"ContainerDied","Data":"b3076b78bb2b4ab221c240edc55c9c81b87b5d2600470051867e57e44b888433"} Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.249282 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3076b78bb2b4ab221c240edc55c9c81b87b5d2600470051867e57e44b888433" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.249358 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-664f-account-create-update-74gq9" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.258164 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-498rf" event={"ID":"a33fd05b-5e69-4042-a5b2-b07dd4edd63d","Type":"ContainerDied","Data":"b8ce590d575034daf3c7b8fa95d4606b451244d536886796633d7e74fcde3e39"} Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.258235 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8ce590d575034daf3c7b8fa95d4606b451244d536886796633d7e74fcde3e39" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.258344 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-498rf" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.305737 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.456304015 podStartE2EDuration="50.305703414s" podCreationTimestamp="2025-12-06 07:17:28 +0000 UTC" firstStartedPulling="2025-12-06 07:18:01.939438719 +0000 UTC m=+1256.752798108" lastFinishedPulling="2025-12-06 07:18:14.788838118 +0000 UTC m=+1269.602197507" observedRunningTime="2025-12-06 07:18:18.30221307 +0000 UTC m=+1273.115572469" watchObservedRunningTime="2025-12-06 07:18:18.305703414 +0000 UTC m=+1273.119062803" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626146 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626682 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aec320e-8fda-4ac0-aac9-85fa4a436c01" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626703 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aec320e-8fda-4ac0-aac9-85fa4a436c01" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626714 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b279d2ed-3be6-4f9e-b27d-8fac529b4d32" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626721 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b279d2ed-3be6-4f9e-b27d-8fac529b4d32" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626738 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45760345-6d26-4a9b-931e-1c4d5cc3ca3e" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626745 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="45760345-6d26-4a9b-931e-1c4d5cc3ca3e" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626759 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4546452-6f33-4f15-9b6e-30aecc6e81d1" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626765 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4546452-6f33-4f15-9b6e-30aecc6e81d1" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626774 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ece29cd-d385-457d-853d-f37b09007d05" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626782 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ece29cd-d385-457d-853d-f37b09007d05" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: E1206 07:18:18.626797 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33fd05b-5e69-4042-a5b2-b07dd4edd63d" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626802 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33fd05b-5e69-4042-a5b2-b07dd4edd63d" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626962 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aec320e-8fda-4ac0-aac9-85fa4a436c01" 
containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626977 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ece29cd-d385-457d-853d-f37b09007d05" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626986 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33fd05b-5e69-4042-a5b2-b07dd4edd63d" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.626995 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b279d2ed-3be6-4f9e-b27d-8fac529b4d32" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.627006 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4546452-6f33-4f15-9b6e-30aecc6e81d1" containerName="mariadb-database-create" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.627014 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="45760345-6d26-4a9b-931e-1c4d5cc3ca3e" containerName="mariadb-account-create-update" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.627945 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.632819 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.646316 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676247 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676341 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676380 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwws5\" (UniqueName: \"kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5\") pod 
\"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.676408 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778481 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778653 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778726 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778779 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.778823 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwws5\" (UniqueName: \"kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.779777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.779960 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: 
\"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.780142 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.780255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.780490 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.803498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwws5\" (UniqueName: \"kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5\") pod \"dnsmasq-dns-8567775dfc-vjjhz\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:18 crc kubenswrapper[4954]: I1206 07:18:18.965918 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:21 crc kubenswrapper[4954]: I1206 07:18:21.573991 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:22 crc kubenswrapper[4954]: I1206 07:18:22.303706 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" event={"ID":"11c1e0d7-df63-4eb2-b161-2f5e0963eb31","Type":"ContainerStarted","Data":"10e9a17f86b20bcca7b107cc51bd5b5ff63af0b55f54b80ff3c57de91879cdc0"} Dec 06 07:18:23 crc kubenswrapper[4954]: I1206 07:18:23.318344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vvr4" event={"ID":"4374b249-a0ad-4b36-9924-fa871e0e63fa","Type":"ContainerStarted","Data":"470a3bb2487480880ccb6620497d7d369d86596f60ec23a908780d07204526e8"} Dec 06 07:18:23 crc kubenswrapper[4954]: I1206 07:18:23.320610 4954 generic.go:334] "Generic (PLEG): container finished" podID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerID="a814b870b8d20d3d0d249c8705d0a67df0f9c455c34785e8ac79ff2697a7edc0" exitCode=0 Dec 06 07:18:23 crc kubenswrapper[4954]: I1206 07:18:23.320671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" event={"ID":"11c1e0d7-df63-4eb2-b161-2f5e0963eb31","Type":"ContainerDied","Data":"a814b870b8d20d3d0d249c8705d0a67df0f9c455c34785e8ac79ff2697a7edc0"} Dec 06 07:18:23 crc kubenswrapper[4954]: I1206 07:18:23.350018 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4vvr4" podStartSLOduration=2.763659015 podStartE2EDuration="10.349991394s" podCreationTimestamp="2025-12-06 07:18:13 +0000 UTC" firstStartedPulling="2025-12-06 07:18:14.780949186 +0000 UTC m=+1269.594308575" 
lastFinishedPulling="2025-12-06 07:18:22.367281575 +0000 UTC m=+1277.180640954" observedRunningTime="2025-12-06 07:18:23.339991666 +0000 UTC m=+1278.153351055" watchObservedRunningTime="2025-12-06 07:18:23.349991394 +0000 UTC m=+1278.163350783" Dec 06 07:18:24 crc kubenswrapper[4954]: I1206 07:18:24.333872 4954 generic.go:334] "Generic (PLEG): container finished" podID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" containerID="3018d3d8b3cb5a9e3856801dbdf5ff7b95038c330f1d30888a44744a692d61e1" exitCode=0 Dec 06 07:18:24 crc kubenswrapper[4954]: I1206 07:18:24.334005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zttzq" event={"ID":"b7854eba-8c30-4c39-9c96-36e1e3cf7437","Type":"ContainerDied","Data":"3018d3d8b3cb5a9e3856801dbdf5ff7b95038c330f1d30888a44744a692d61e1"} Dec 06 07:18:24 crc kubenswrapper[4954]: I1206 07:18:24.336859 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" event={"ID":"11c1e0d7-df63-4eb2-b161-2f5e0963eb31","Type":"ContainerStarted","Data":"cc44285073be119ae0f2ffc2441931c2a4941b698fafe7cdc1622941ff299f7d"} Dec 06 07:18:24 crc kubenswrapper[4954]: I1206 07:18:24.379875 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" podStartSLOduration=6.379848727 podStartE2EDuration="6.379848727s" podCreationTimestamp="2025-12-06 07:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:24.378340906 +0000 UTC m=+1279.191700315" watchObservedRunningTime="2025-12-06 07:18:24.379848727 +0000 UTC m=+1279.193208116" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.348637 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.763407 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zttzq" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.900445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ffl4\" (UniqueName: \"kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4\") pod \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.900594 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data\") pod \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.900812 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data\") pod \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.900862 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle\") pod \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\" (UID: \"b7854eba-8c30-4c39-9c96-36e1e3cf7437\") " Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.909511 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b7854eba-8c30-4c39-9c96-36e1e3cf7437" (UID: "b7854eba-8c30-4c39-9c96-36e1e3cf7437"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.909635 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4" (OuterVolumeSpecName: "kube-api-access-9ffl4") pod "b7854eba-8c30-4c39-9c96-36e1e3cf7437" (UID: "b7854eba-8c30-4c39-9c96-36e1e3cf7437"). InnerVolumeSpecName "kube-api-access-9ffl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.939380 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7854eba-8c30-4c39-9c96-36e1e3cf7437" (UID: "b7854eba-8c30-4c39-9c96-36e1e3cf7437"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:25 crc kubenswrapper[4954]: I1206 07:18:25.964041 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data" (OuterVolumeSpecName: "config-data") pod "b7854eba-8c30-4c39-9c96-36e1e3cf7437" (UID: "b7854eba-8c30-4c39-9c96-36e1e3cf7437"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.002962 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.003007 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.003026 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ffl4\" (UniqueName: \"kubernetes.io/projected/b7854eba-8c30-4c39-9c96-36e1e3cf7437-kube-api-access-9ffl4\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.003041 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7854eba-8c30-4c39-9c96-36e1e3cf7437-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.362640 4954 generic.go:334] "Generic (PLEG): container finished" podID="4374b249-a0ad-4b36-9924-fa871e0e63fa" containerID="470a3bb2487480880ccb6620497d7d369d86596f60ec23a908780d07204526e8" exitCode=0 Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.362749 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vvr4" event={"ID":"4374b249-a0ad-4b36-9924-fa871e0e63fa","Type":"ContainerDied","Data":"470a3bb2487480880ccb6620497d7d369d86596f60ec23a908780d07204526e8"} Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.365274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zttzq" event={"ID":"b7854eba-8c30-4c39-9c96-36e1e3cf7437","Type":"ContainerDied","Data":"9eec3a0567261ca112136f2be93f2caf286e7004c84c92aa76b6fe1e3d515b27"} Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.365478 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eec3a0567261ca112136f2be93f2caf286e7004c84c92aa76b6fe1e3d515b27" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.365303 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zttzq" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.866970 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.937022 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"] Dec 06 07:18:26 crc kubenswrapper[4954]: E1206 07:18:26.937511 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" containerName="glance-db-sync" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.937533 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" containerName="glance-db-sync" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.937752 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" containerName="glance-db-sync" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.938783 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:26 crc kubenswrapper[4954]: I1206 07:18:26.974400 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"] Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.124131 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.124236 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.125086 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5zj\" (UniqueName: \"kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.125198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.125340 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.125426 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.226985 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.227051 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.227129 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.227189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.227219 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5zj\" (UniqueName: \"kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.227239 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.228331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.228331 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.228459 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.228732 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.229068 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.249689 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5zj\" (UniqueName: 
\"kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj\") pod \"dnsmasq-dns-7fbdccd69c-j7sc4\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") " pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.268352 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:27 crc kubenswrapper[4954]: I1206 07:18:27.400533 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="dnsmasq-dns" containerID="cri-o://cc44285073be119ae0f2ffc2441931c2a4941b698fafe7cdc1622941ff299f7d" gracePeriod=10 Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.420697 4954 generic.go:334] "Generic (PLEG): container finished" podID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerID="cc44285073be119ae0f2ffc2441931c2a4941b698fafe7cdc1622941ff299f7d" exitCode=0 Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.420846 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" event={"ID":"11c1e0d7-df63-4eb2-b161-2f5e0963eb31","Type":"ContainerDied","Data":"cc44285073be119ae0f2ffc2441931c2a4941b698fafe7cdc1622941ff299f7d"} Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.707364 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.709606 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vvr4" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.754765 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"] Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.858741 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v427b\" (UniqueName: \"kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b\") pod \"4374b249-a0ad-4b36-9924-fa871e0e63fa\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.858893 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle\") pod \"4374b249-a0ad-4b36-9924-fa871e0e63fa\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.858981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859102 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwws5\" (UniqueName: \"kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859162 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data\") pod \"4374b249-a0ad-4b36-9924-fa871e0e63fa\" (UID: \"4374b249-a0ad-4b36-9924-fa871e0e63fa\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859196 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859227 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859267 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.859295 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc\") pod \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\" (UID: \"11c1e0d7-df63-4eb2-b161-2f5e0963eb31\") " Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.865103 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5" (OuterVolumeSpecName: "kube-api-access-hwws5") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "kube-api-access-hwws5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.886777 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b" (OuterVolumeSpecName: "kube-api-access-v427b") pod "4374b249-a0ad-4b36-9924-fa871e0e63fa" (UID: "4374b249-a0ad-4b36-9924-fa871e0e63fa"). InnerVolumeSpecName "kube-api-access-v427b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.921763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4374b249-a0ad-4b36-9924-fa871e0e63fa" (UID: "4374b249-a0ad-4b36-9924-fa871e0e63fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.924761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.925247 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data" (OuterVolumeSpecName: "config-data") pod "4374b249-a0ad-4b36-9924-fa871e0e63fa" (UID: "4374b249-a0ad-4b36-9924-fa871e0e63fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.930045 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config" (OuterVolumeSpecName: "config") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.937539 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.938373 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.939359 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11c1e0d7-df63-4eb2-b161-2f5e0963eb31" (UID: "11c1e0d7-df63-4eb2-b161-2f5e0963eb31"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962828 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwws5\" (UniqueName: \"kubernetes.io/projected/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-kube-api-access-hwws5\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962885 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962904 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962919 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962931 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962942 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962957 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v427b\" (UniqueName: \"kubernetes.io/projected/4374b249-a0ad-4b36-9924-fa871e0e63fa-kube-api-access-v427b\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962968 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4374b249-a0ad-4b36-9924-fa871e0e63fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:28 crc kubenswrapper[4954]: I1206 07:18:28.962976 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11c1e0d7-df63-4eb2-b161-2f5e0963eb31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.431779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vvr4" event={"ID":"4374b249-a0ad-4b36-9924-fa871e0e63fa","Type":"ContainerDied","Data":"525c0bd9311c30608a94e455c3c82e7a4a8011b1980e5a2a00a9b73bafbd10ba"} Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.431841 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="525c0bd9311c30608a94e455c3c82e7a4a8011b1980e5a2a00a9b73bafbd10ba" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.431799 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4vvr4" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.434032 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" event={"ID":"11c1e0d7-df63-4eb2-b161-2f5e0963eb31","Type":"ContainerDied","Data":"10e9a17f86b20bcca7b107cc51bd5b5ff63af0b55f54b80ff3c57de91879cdc0"} Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.434119 4954 scope.go:117] "RemoveContainer" containerID="cc44285073be119ae0f2ffc2441931c2a4941b698fafe7cdc1622941ff299f7d" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.434181 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8567775dfc-vjjhz" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.436700 4954 generic.go:334] "Generic (PLEG): container finished" podID="895971f5-3210-47df-96b8-38b5061b45eb" containerID="5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688" exitCode=0 Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.436731 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" event={"ID":"895971f5-3210-47df-96b8-38b5061b45eb","Type":"ContainerDied","Data":"5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688"} Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.436749 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" event={"ID":"895971f5-3210-47df-96b8-38b5061b45eb","Type":"ContainerStarted","Data":"4614c7edb988b3995374cbb64873ed794baf40b8294b6a8936c99d86b824383a"} Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.459252 4954 scope.go:117] "RemoveContainer" containerID="a814b870b8d20d3d0d249c8705d0a67df0f9c455c34785e8ac79ff2697a7edc0" Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.504767 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:29 crc kubenswrapper[4954]: I1206 07:18:29.518781 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8567775dfc-vjjhz"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.050412 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.172518 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"] Dec 06 07:18:30 crc kubenswrapper[4954]: E1206 07:18:30.173139 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="dnsmasq-dns" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.173168 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="dnsmasq-dns" Dec 06 07:18:30 crc kubenswrapper[4954]: E1206 07:18:30.173193 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="init" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.173202 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="init" Dec 06 07:18:30 crc kubenswrapper[4954]: E1206 07:18:30.173230 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4374b249-a0ad-4b36-9924-fa871e0e63fa" containerName="keystone-db-sync" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.173239 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4374b249-a0ad-4b36-9924-fa871e0e63fa" containerName="keystone-db-sync" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.173456 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4374b249-a0ad-4b36-9924-fa871e0e63fa" containerName="keystone-db-sync" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.173482 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" containerName="dnsmasq-dns" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.174745 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.187114 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9kg4w"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.188910 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.198427 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.198459 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.198597 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.198454 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8jst" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.198844 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.210852 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.239518 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9kg4w"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336546 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336634 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336779 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336824 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336862 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336880 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336927 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.336982 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.337007 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tlv\" (UniqueName: \"kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.337065 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvg4\" (UniqueName: \"kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.361763 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-sync-fcgjm"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.363464 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.365830 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hk7x6" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.367115 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.369008 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.377197 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fcgjm"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.443860 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.443922 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.443961 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.443994 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: 
I1206 07:18:30.444126 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444154 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9tlv\" (UniqueName: \"kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444180 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvg4\" (UniqueName: \"kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444266 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.444303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.446295 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.446302 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.446358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.450353 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.451443 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.458454 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.464594 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.482845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.483384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.483496 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.507781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" event={"ID":"895971f5-3210-47df-96b8-38b5061b45eb","Type":"ContainerStarted","Data":"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"} Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.508069 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.513994 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvg4\" (UniqueName: \"kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4\") pod \"dnsmasq-dns-64ccc486bf-9ld6k\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") " pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.514789 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-khvqc"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.516151 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.525200 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.528416 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9sp84" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.528799 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.529103 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.540637 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-khvqc"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.552631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.552796 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.552833 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqzh\" (UniqueName: \"kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.552949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.552987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.556105 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.558189 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9tlv\" (UniqueName: \"kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv\") pod \"keystone-bootstrap-9kg4w\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.567106 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" podStartSLOduration=4.567042935 podStartE2EDuration="4.567042935s" podCreationTimestamp="2025-12-06 07:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:30.554821047 +0000 UTC m=+1285.368180436" watchObservedRunningTime="2025-12-06 07:18:30.567042935 +0000 UTC m=+1285.380402334" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.617171 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9zvlz"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.618976 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.627125 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.627538 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.627682 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6n4f8" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.662093 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zvlz"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665236 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665348 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdcc\" (UniqueName: \"kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665506 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665703 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665825 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqzh\" (UniqueName: \"kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.665951 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.666035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.666121 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.666199 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.667227 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.671218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.767017 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdcc\" (UniqueName: \"kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc\") pod \"neutron-db-sync-khvqc\" (UID: 
\"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770354 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288vg\" (UniqueName: \"kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770433 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770460 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770480 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.770516 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.781386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.783657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqzh\" (UniqueName: \"kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.783853 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.797051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.808865 4954 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.812771 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdcc\" (UniqueName: \"kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc\") pod \"neutron-db-sync-khvqc\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.814671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle\") pod \"cinder-db-sync-fcgjm\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.821516 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.825993 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.827681 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.837140 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.846303 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.867719 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xc5wf"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.871316 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.878778 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.878828 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.878865 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.878883 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.879063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-288vg\" (UniqueName: \"kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.882795 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.884233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.888867 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.895273 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.896793 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 
07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.896876 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-868gk" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.904689 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.914792 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-288vg\" (UniqueName: \"kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg\") pod \"placement-db-sync-9zvlz\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.925307 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xc5wf"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.934793 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.944311 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"] Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.944448 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.970699 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-khvqc" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.983338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.983455 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.983820 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984381 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4jw\" (UniqueName: \"kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984480 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984508 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984657 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984707 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:30 crc kubenswrapper[4954]: I1206 07:18:30.984751 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8cr\" (UniqueName: \"kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.086800 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 
06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.086876 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.086921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.086948 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087088 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8cr\" (UniqueName: \"kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087150 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087190 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g94p\" (UniqueName: \"kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087335 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087431 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4jw\" (UniqueName: \"kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087732 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.087808 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.088552 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.092016 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.094595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.097592 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc 
kubenswrapper[4954]: I1206 07:18:31.103713 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.103876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.103916 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.111436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.128742 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4jw\" (UniqueName: \"kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw\") pod \"barbican-db-sync-xc5wf\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.128905 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8cr\" (UniqueName: \"kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr\") pod \"ceilometer-0\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189294 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189423 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb\") pod 
\"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.189582 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g94p\" (UniqueName: \"kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.191045 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.191804 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.192334 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.192717 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.192853 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.195285 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zvlz" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.212175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g94p\" (UniqueName: \"kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p\") pod \"dnsmasq-dns-79464d554c-vgvct\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.218174 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.229740 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.270581 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.302344 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.342481 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.356636 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.373437 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-j5l2l" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.373592 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.373808 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.373938 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.380274 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.433809 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9kg4w"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.471097 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c1e0d7-df63-4eb2-b161-2f5e0963eb31" path="/var/lib/kubelet/pods/11c1e0d7-df63-4eb2-b161-2f5e0963eb31/volumes" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.473249 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.477303 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.477695 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.485254 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.486003 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.513786 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514211 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514337 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2str\" (UniqueName: \"kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514600 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514746 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.514918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.539524 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" event={"ID":"370e499f-2b32-4084-8a3d-7a800c154052","Type":"ContainerStarted","Data":"6f2ba5c6b8efd836ed5d2ffbb6316b47bfa2b1d1366102a9f4befe103f7c495d"} Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.541786 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="dnsmasq-dns" containerID="cri-o://a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6" gracePeriod=10 Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.542243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9kg4w" event={"ID":"11864c16-c8f0-4933-bf0d-aa97e1c4e650","Type":"ContainerStarted","Data":"c99331f4772c8119b9e36da62e9a7cb923ba71329b35d614b4a1b4f59b3013f5"} Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.593308 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-khvqc"] Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.618885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.620052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.620578 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.620932 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.621113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.621373 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 
crc kubenswrapper[4954]: I1206 07:18:31.622277 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.622220 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.623112 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.623227 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2str\" (UniqueName: \"kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.623377 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.635299 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.635624 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.635744 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.635840 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0" Dec 06 07:18:31 
crc kubenswrapper[4954]: I1206 07:18:31.635971 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.636232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.636377 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x42lz\" (UniqueName: \"kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.632963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.638288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.638712 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.643694 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.649109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.650367 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2str\" (UniqueName: \"kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.689780 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fcgjm"]
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.736874 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739088 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739248 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739283 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.739312 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x42lz\" (UniqueName: \"kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.740497 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.740512 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.740865 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.747021 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.747819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.760088 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.777777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x42lz\" (UniqueName: \"kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.796055 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: W1206 07:18:31.810761 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a53752_0ee5_43db_b968_b4d11414ffdb.slice/crio-6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac WatchSource:0}: Error finding container 6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac: Status 404 returned error can't find the container with id 6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.824294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:31 crc kubenswrapper[4954]: I1206 07:18:31.975924 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xc5wf"]
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.013398 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.019462 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.029817 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zvlz"]
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.115298 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"]
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.118448 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 06 07:18:32 crc kubenswrapper[4954]: W1206 07:18:32.120280 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcefc5dd8_7bea_4938_8489_7b5edc4e79f8.slice/crio-a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a WatchSource:0}: Error finding container a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a: Status 404 returned error can't find the container with id a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.219174 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4"
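
Two things are worth noting in the block above. First, the volume manager works through a fixed sequence for every pod volume: reconciler_common.go logs VerifyControllerAttachedVolume as a volume enters the desired state, then MountVolume started, and operation_generator.go reports MountVolume.MountDevice (once per device, only for volumes that have one, such as the local-storage PVs) followed by MountVolume.SetUp (once per pod). Second, the two manager.go:1169 "Failed to process watch event ... 404" warnings appear to be cadvisor reacting to a cgroup watch event before the new CRI-O container is inspectable; the same container IDs show up as ContainerStarted moments later, so they read as a transient race rather than an error. Below is a self-contained toy of the reconciler's desired-versus-actual diff; the maps and messages are invented for illustration and are not kubelet's real data structures:

    package main

    import "fmt"

    func main() {
        // Toy state: volume name -> present. The real reconciler tracks much
        // more (device mount state, outer/inner volume spec names, contexts).
        desired := map[string]bool{"config-data": true, "scripts": true, "logs": true}
        actual := map[string]bool{"logs": true}

        // Anything desired but not mounted gets a mount operation...
        for v := range desired {
            if !actual[v] {
                fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
            }
        }
        // ...and anything mounted but no longer desired gets an unmount.
        for v := range actual {
            if !desired[v] {
                fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
            }
        }
    }
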
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.352746 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.353534 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.353600 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.353663 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.353703 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj5zj\" (UniqueName: \"kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.353772 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config\") pod \"895971f5-3210-47df-96b8-38b5061b45eb\" (UID: \"895971f5-3210-47df-96b8-38b5061b45eb\") "
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.384922 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj" (OuterVolumeSpecName: "kube-api-access-fj5zj") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "kube-api-access-fj5zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.456878 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj5zj\" (UniqueName: \"kubernetes.io/projected/895971f5-3210-47df-96b8-38b5061b45eb-kube-api-access-fj5zj\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.562156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config" (OuterVolumeSpecName: "config") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.613726 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.616060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9kg4w" event={"ID":"11864c16-c8f0-4933-bf0d-aa97e1c4e650","Type":"ContainerStarted","Data":"4dde86b4c7309d7ef903e4cc4005a418902a0316bf43d573aa6b04182fb92324"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.617787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.629113 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fcgjm" event={"ID":"24a53752-0ee5-43db-b968-b4d11414ffdb","Type":"ContainerStarted","Data":"6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.646034 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.646630 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9kg4w" podStartSLOduration=2.646606254 podStartE2EDuration="2.646606254s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:32.64647584 +0000 UTC m=+1287.459835239" watchObservedRunningTime="2025-12-06 07:18:32.646606254 +0000 UTC m=+1287.459965663"
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.651764 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerStarted","Data":"ae08e3a0dba5fd3b4b77ad50e0a1477e3f4d41f698c372d054548bfe692081be"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.660967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "895971f5-3210-47df-96b8-38b5061b45eb" (UID: "895971f5-3210-47df-96b8-38b5061b45eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.666414 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.666461 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.666471 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.666488 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.666498 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/895971f5-3210-47df-96b8-38b5061b45eb-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.717376 4954 generic.go:334] "Generic (PLEG): container finished" podID="370e499f-2b32-4084-8a3d-7a800c154052" containerID="050a4f5980a9694153fae6930308715f2018802be71819d326685211458af85b" exitCode=0
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.717550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" event={"ID":"370e499f-2b32-4084-8a3d-7a800c154052","Type":"ContainerDied","Data":"050a4f5980a9694153fae6930308715f2018802be71819d326685211458af85b"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.773162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xc5wf" event={"ID":"1b212057-565c-4246-820a-a804fb6da962","Type":"ContainerStarted","Data":"8dcf9db4bd9e3a9667f7c4d856314e1bce8bd44ce7234114db928d38d365f756"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.814246 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-khvqc" event={"ID":"8866facf-9bb1-4deb-9244-15805d162346","Type":"ContainerStarted","Data":"71b04eafce145d86555e0beb19909a5726709e8e20af73819ad4cf85b6cd66fc"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.814297 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-khvqc" event={"ID":"8866facf-9bb1-4deb-9244-15805d162346","Type":"ContainerStarted","Data":"c53a61a1f1a3aa8365da3fcccfbf47d0de46a3c6c8eaadcd8fa1065601876551"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.835388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerStarted","Data":"1c274f4ec368dd0bbdd0327af80dcdd0dbe4da385fae01741d472204dc012d51"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.835434 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerStarted","Data":"a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.857641 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.868116 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zvlz" event={"ID":"c5f9fd4e-cc86-43cf-8195-cae0becb9e64","Type":"ContainerStarted","Data":"1135bbf7d222445109c31211f37d1e1c17c046a3f9e8171323fa2b652ac9193a"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.946021 4954 generic.go:334] "Generic (PLEG): container finished" podID="895971f5-3210-47df-96b8-38b5061b45eb" containerID="a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6" exitCode=0
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.946090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" event={"ID":"895971f5-3210-47df-96b8-38b5061b45eb","Type":"ContainerDied","Data":"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.946135 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4" event={"ID":"895971f5-3210-47df-96b8-38b5061b45eb","Type":"ContainerDied","Data":"4614c7edb988b3995374cbb64873ed794baf40b8294b6a8936c99d86b824383a"}
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.946155 4954 scope.go:117] "RemoveContainer" containerID="a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"
Dec 06 07:18:32 crc kubenswrapper[4954]: I1206 07:18:32.946341 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbdccd69c-j7sc4"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.069513 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-khvqc" podStartSLOduration=3.069473511 podStartE2EDuration="3.069473511s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:33.026318502 +0000 UTC m=+1287.839677901" watchObservedRunningTime="2025-12-06 07:18:33.069473511 +0000 UTC m=+1287.882832900"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.123983 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.154278 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fbdccd69c-j7sc4"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.171076 4954 scope.go:117] "RemoveContainer" containerID="5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.225514 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.347783 4954 scope.go:117] "RemoveContainer" containerID="a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"
Dec 06 07:18:33 crc kubenswrapper[4954]: E1206 07:18:33.348479 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6\": container with ID starting with a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6 not found: ID does not exist" containerID="a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.348547 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6"} err="failed to get container status \"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6\": rpc error: code = NotFound desc = could not find container \"a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6\": container with ID starting with a8d0df08c9e8c9154ab5102f96cb135325137f164b680589b8113828639fd0e6 not found: ID does not exist"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.348595 4954 scope.go:117] "RemoveContainer" containerID="5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688"
Dec 06 07:18:33 crc kubenswrapper[4954]: E1206 07:18:33.349031 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688\": container with ID starting with 5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688 not found: ID does not exist" containerID="5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.349056 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688"} err="failed to get container status \"5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688\": rpc error: code = NotFound desc = could not find container \"5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688\": container with ID starting with 5222d83bbde98003b6ab6775544e901eb43a5ab5116bbab952d73c90b941f688 not found: ID does not exist"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.379099 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504393 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkvg4\" (UniqueName: \"kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504690 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504840 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.504915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb\") pod \"370e499f-2b32-4084-8a3d-7a800c154052\" (UID: \"370e499f-2b32-4084-8a3d-7a800c154052\") "
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.506426 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895971f5-3210-47df-96b8-38b5061b45eb" path="/var/lib/kubelet/pods/895971f5-3210-47df-96b8-38b5061b45eb/volumes"
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.532916 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4" (OuterVolumeSpecName: "kube-api-access-mkvg4") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "kube-api-access-mkvg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.607506 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkvg4\" (UniqueName: \"kubernetes.io/projected/370e499f-2b32-4084-8a3d-7a800c154052-kube-api-access-mkvg4\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.654257 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config" (OuterVolumeSpecName: "config") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.656966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.675463 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.695221 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.708924 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.708969 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.708979 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.739691 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.745801 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "370e499f-2b32-4084-8a3d-7a800c154052" (UID: "370e499f-2b32-4084-8a3d-7a800c154052"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.768822 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.816474 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.816520 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/370e499f-2b32-4084-8a3d-7a800c154052-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.827216 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:33 crc kubenswrapper[4954]: I1206 07:18:33.984355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerStarted","Data":"69b71cad79c1659dcc104073f6be27ff84c14f42dc6a65a1cafcacddd5ab566a"}
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.032705 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k"
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.032677 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64ccc486bf-9ld6k" event={"ID":"370e499f-2b32-4084-8a3d-7a800c154052","Type":"ContainerDied","Data":"6f2ba5c6b8efd836ed5d2ffbb6316b47bfa2b1d1366102a9f4befe103f7c495d"}
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.035455 4954 scope.go:117] "RemoveContainer" containerID="050a4f5980a9694153fae6930308715f2018802be71819d326685211458af85b"
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.049831 4954 generic.go:334] "Generic (PLEG): container finished" podID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerID="1c274f4ec368dd0bbdd0327af80dcdd0dbe4da385fae01741d472204dc012d51" exitCode=0
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.049953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerDied","Data":"1c274f4ec368dd0bbdd0327af80dcdd0dbe4da385fae01741d472204dc012d51"}
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.054999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerStarted","Data":"8d570a4452d7bd1d70e6f0b22ffe6089ec1bd016a23174c80d930a6d2ac9796f"}
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.167420 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"]
Dec 06 07:18:34 crc kubenswrapper[4954]: I1206 07:18:34.196944 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64ccc486bf-9ld6k"]
Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.112458 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerStarted","Data":"333c190eb878e03213e09e785af05b76a563115fc89aee5caea585937ac11f76"}
Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.124004 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerStarted","Data":"3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d"}
pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerStarted","Data":"3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d"} Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.124405 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.127050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerStarted","Data":"9578910258df4e76f347169a5399c7fb23a02dd465bb006c6de63efd52f48f71"} Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.459420 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370e499f-2b32-4084-8a3d-7a800c154052" path="/var/lib/kubelet/pods/370e499f-2b32-4084-8a3d-7a800c154052/volumes" Dec 06 07:18:35 crc kubenswrapper[4954]: I1206 07:18:35.483802 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79464d554c-vgvct" podStartSLOduration=5.483770331 podStartE2EDuration="5.483770331s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:35.171613705 +0000 UTC m=+1289.984973104" watchObservedRunningTime="2025-12-06 07:18:35.483770331 +0000 UTC m=+1290.297129720" Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.139887 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerStarted","Data":"3c228de746b150019069d960694b69b97286f938c755c26b169c69119edf1149"} Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.139985 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-log" containerID="cri-o://333c190eb878e03213e09e785af05b76a563115fc89aee5caea585937ac11f76" gracePeriod=30 Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.140381 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-httpd" containerID="cri-o://3c228de746b150019069d960694b69b97286f938c755c26b169c69119edf1149" gracePeriod=30 Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.144545 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-log" containerID="cri-o://9578910258df4e76f347169a5399c7fb23a02dd465bb006c6de63efd52f48f71" gracePeriod=30 Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.144614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerStarted","Data":"b48f031a1d531a6911d8cd674232d57e40393a990a278e0223ce75237fa48e04"} Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.144768 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-httpd" 
containerID="cri-o://b48f031a1d531a6911d8cd674232d57e40393a990a278e0223ce75237fa48e04" gracePeriod=30 Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.236346 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.236318984 podStartE2EDuration="6.236318984s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:36.231275079 +0000 UTC m=+1291.044634468" watchObservedRunningTime="2025-12-06 07:18:36.236318984 +0000 UTC m=+1291.049678373" Dec 06 07:18:36 crc kubenswrapper[4954]: I1206 07:18:36.240760 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.240734443 podStartE2EDuration="6.240734443s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:36.185577483 +0000 UTC m=+1290.998936892" watchObservedRunningTime="2025-12-06 07:18:36.240734443 +0000 UTC m=+1291.054093832" Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.172348 4954 generic.go:334] "Generic (PLEG): container finished" podID="850e525c-ce83-4a19-90a1-f513acc15541" containerID="b48f031a1d531a6911d8cd674232d57e40393a990a278e0223ce75237fa48e04" exitCode=143 Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.172814 4954 generic.go:334] "Generic (PLEG): container finished" podID="850e525c-ce83-4a19-90a1-f513acc15541" containerID="9578910258df4e76f347169a5399c7fb23a02dd465bb006c6de63efd52f48f71" exitCode=143 Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.172399 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerDied","Data":"b48f031a1d531a6911d8cd674232d57e40393a990a278e0223ce75237fa48e04"} Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.172965 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerDied","Data":"9578910258df4e76f347169a5399c7fb23a02dd465bb006c6de63efd52f48f71"} Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.178247 4954 generic.go:334] "Generic (PLEG): container finished" podID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerID="3c228de746b150019069d960694b69b97286f938c755c26b169c69119edf1149" exitCode=143 Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.178317 4954 generic.go:334] "Generic (PLEG): container finished" podID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerID="333c190eb878e03213e09e785af05b76a563115fc89aee5caea585937ac11f76" exitCode=143 Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.178325 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerDied","Data":"3c228de746b150019069d960694b69b97286f938c755c26b169c69119edf1149"} Dec 06 07:18:37 crc kubenswrapper[4954]: I1206 07:18:37.178384 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerDied","Data":"333c190eb878e03213e09e785af05b76a563115fc89aee5caea585937ac11f76"} Dec 06 07:18:40 crc 
Dec 06 07:18:40 crc kubenswrapper[4954]: I1206 07:18:40.102731 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:18:40 crc kubenswrapper[4954]: I1206 07:18:40.103320 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:18:41 crc kubenswrapper[4954]: I1206 07:18:41.238381 4954 generic.go:334] "Generic (PLEG): container finished" podID="11864c16-c8f0-4933-bf0d-aa97e1c4e650" containerID="4dde86b4c7309d7ef903e4cc4005a418902a0316bf43d573aa6b04182fb92324" exitCode=0
Dec 06 07:18:41 crc kubenswrapper[4954]: I1206 07:18:41.238467 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9kg4w" event={"ID":"11864c16-c8f0-4933-bf0d-aa97e1c4e650","Type":"ContainerDied","Data":"4dde86b4c7309d7ef903e4cc4005a418902a0316bf43d573aa6b04182fb92324"}
Dec 06 07:18:41 crc kubenswrapper[4954]: I1206 07:18:41.271742 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79464d554c-vgvct"
Dec 06 07:18:41 crc kubenswrapper[4954]: I1206 07:18:41.371505 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"]
Dec 06 07:18:41 crc kubenswrapper[4954]: I1206 07:18:41.372067 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" containerID="cri-o://9c8b3230b89e7cd383e6e20c8ead07828efba3da118a04036e69862d1033971b" gracePeriod=10
Dec 06 07:18:42 crc kubenswrapper[4954]: I1206 07:18:42.270309 4954 generic.go:334] "Generic (PLEG): container finished" podID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerID="9c8b3230b89e7cd383e6e20c8ead07828efba3da118a04036e69862d1033971b" exitCode=0
Dec 06 07:18:42 crc kubenswrapper[4954]: I1206 07:18:42.270782 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" event={"ID":"84bce81b-d290-4c3a-b5ce-e7188df23c4e","Type":"ContainerDied","Data":"9c8b3230b89e7cd383e6e20c8ead07828efba3da118a04036e69862d1033971b"}
Dec 06 07:18:43 crc kubenswrapper[4954]: I1206 07:18:43.316728 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.170356 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.305379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e525c-ce83-4a19-90a1-f513acc15541","Type":"ContainerDied","Data":"8d570a4452d7bd1d70e6f0b22ffe6089ec1bd016a23174c80d930a6d2ac9796f"}
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.305446 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
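
Both probe failures above are plain connection refusals rather than bad responses: the machine-config-daemon liveness probe is an HTTP GET against 127.0.0.1:8798/health, and the dnsmasq readiness probe is a TCP dial to 10.217.0.122:5353 whose target container had just been killed. Either check can be reproduced on the node with only the standard library (addresses copied from the log lines; a refused connection surfaces as an error from the dial):

    package main

    import (
        "fmt"
        "net"
        "net/http"
        "time"
    )

    func main() {
        // HTTP liveness-style check; fails fast if nothing listens on 8798.
        client := http.Client{Timeout: time.Second}
        if _, err := client.Get("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("liveness:", err) // ... connect: connection refused
        }
        // TCP readiness-style check against the dnsmasq pod IP and port.
        if _, err := net.DialTimeout("tcp", "10.217.0.122:5353", time.Second); err != nil {
            fmt.Println("readiness:", err)
        }
    }
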
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.305456 4954 scope.go:117] "RemoveContainer" containerID="b48f031a1d531a6911d8cd674232d57e40393a990a278e0223ce75237fa48e04"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.310761 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.310927 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311109 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311213 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2str\" (UniqueName: \"kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311360 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311444 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311586 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts\") pod \"850e525c-ce83-4a19-90a1-f513acc15541\" (UID: \"850e525c-ce83-4a19-90a1-f513acc15541\") "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.311730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.312450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs" (OuterVolumeSpecName: "logs") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.312486 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.320547 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts" (OuterVolumeSpecName: "scripts") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.324390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str" (OuterVolumeSpecName: "kube-api-access-c2str") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "kube-api-access-c2str". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.331225 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.342843 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.365624 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.373191 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data" (OuterVolumeSpecName: "config-data") pod "850e525c-ce83-4a19-90a1-f513acc15541" (UID: "850e525c-ce83-4a19-90a1-f513acc15541"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414428 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414470 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414481 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e525c-ce83-4a19-90a1-f513acc15541-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414489 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414503 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2str\" (UniqueName: \"kubernetes.io/projected/850e525c-ce83-4a19-90a1-f513acc15541-kube-api-access-c2str\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414512 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e525c-ce83-4a19-90a1-f513acc15541-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.414550 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.432825 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.518154 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.669456 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.686336 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.707418 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:45 crc kubenswrapper[4954]: E1206 07:18:45.707976 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-log"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.707991 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-log"
Dec 06 07:18:45 crc kubenswrapper[4954]: E1206 07:18:45.708016 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-httpd"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708023 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-httpd"
Dec 06 07:18:45 crc kubenswrapper[4954]: E1206 07:18:45.708039 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="dnsmasq-dns"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708046 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="dnsmasq-dns"
Dec 06 07:18:45 crc kubenswrapper[4954]: E1206 07:18:45.708062 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e499f-2b32-4084-8a3d-7a800c154052" containerName="init"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708068 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e499f-2b32-4084-8a3d-7a800c154052" containerName="init"
Dec 06 07:18:45 crc kubenswrapper[4954]: E1206 07:18:45.708087 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="init"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708095 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="init"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708348 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e499f-2b32-4084-8a3d-7a800c154052" containerName="init"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708383 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="895971f5-3210-47df-96b8-38b5061b45eb" containerName="dnsmasq-dns"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708402 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-log"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.708410 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="850e525c-ce83-4a19-90a1-f513acc15541" containerName="glance-httpd"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.709463 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.871869 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.873700 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.887842 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979143 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5h4\" (UniqueName: \"kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979219 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979329 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979349 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979390 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:45 crc kubenswrapper[4954]: I1206 07:18:45.979417 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.082985 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.083054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.083837 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.083893 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.083974 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5h4\" (UniqueName: \"kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.084010 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.084032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.084052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.084837 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.084967 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.085463 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.092670 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.093301 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.093832 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.094586 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.105677 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5h4\" (UniqueName: \"kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.134411 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " pod="openstack/glance-default-external-api-0"
Dec 06 07:18:46 crc kubenswrapper[4954]: I1206 07:18:46.213277 4954 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:18:47 crc kubenswrapper[4954]: I1206 07:18:47.457723 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850e525c-ce83-4a19-90a1-f513acc15541" path="/var/lib/kubelet/pods/850e525c-ce83-4a19-90a1-f513acc15541/volumes" Dec 06 07:18:53 crc kubenswrapper[4954]: I1206 07:18:53.325287 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 06 07:18:54 crc kubenswrapper[4954]: E1206 07:18:54.754747 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f" Dec 06 07:18:54 crc kubenswrapper[4954]: E1206 07:18:54.754958 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cc4jw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xc5wf_openstack(1b212057-565c-4246-820a-a804fb6da962): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:18:54 crc kubenswrapper[4954]: E1206 07:18:54.756543 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xc5wf" podUID="1b212057-565c-4246-820a-a804fb6da962" Dec 06 07:18:54 crc kubenswrapper[4954]: I1206 07:18:54.855254 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:54 crc kubenswrapper[4954]: I1206 07:18:54.872418 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:54 crc kubenswrapper[4954]: I1206 07:18:54.872953 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.050848 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data\") pod \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.051033 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb\") pod \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.051211 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb\") pod \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.051257 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.051330 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x42lz\" (UniqueName: \"kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.051358 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053205 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053474 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config\") pod \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053534 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts\") pod 
\"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053582 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc\") pod \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053623 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053697 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053743 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc7zg\" (UniqueName: \"kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg\") pod \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\" (UID: \"84bce81b-d290-4c3a-b5ce-e7188df23c4e\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053767 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053792 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9tlv\" (UniqueName: \"kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv\") pod \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.053876 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys\") pod \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.054028 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run\") pod \"b0142b96-53a8-4914-963f-e1bad9a870e0\" (UID: \"b0142b96-53a8-4914-963f-e1bad9a870e0\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.054090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys\") pod \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.054123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle\") pod \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\" (UID: \"11864c16-c8f0-4933-bf0d-aa97e1c4e650\") 
" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.058457 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz" (OuterVolumeSpecName: "kube-api-access-x42lz") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "kube-api-access-x42lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.060905 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.066432 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.067375 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.067504 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts" (OuterVolumeSpecName: "scripts") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.070630 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs" (OuterVolumeSpecName: "logs") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.072348 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg" (OuterVolumeSpecName: "kube-api-access-lc7zg") pod "84bce81b-d290-4c3a-b5ce-e7188df23c4e" (UID: "84bce81b-d290-4c3a-b5ce-e7188df23c4e"). InnerVolumeSpecName "kube-api-access-lc7zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.076348 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts" (OuterVolumeSpecName: "scripts") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.079663 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv" (OuterVolumeSpecName: "kube-api-access-w9tlv") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "kube-api-access-w9tlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.085962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.096681 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data" (OuterVolumeSpecName: "config-data") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.124126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84bce81b-d290-4c3a-b5ce-e7188df23c4e" (UID: "84bce81b-d290-4c3a-b5ce-e7188df23c4e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.126246 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.126651 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84bce81b-d290-4c3a-b5ce-e7188df23c4e" (UID: "84bce81b-d290-4c3a-b5ce-e7188df23c4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.128200 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84bce81b-d290-4c3a-b5ce-e7188df23c4e" (UID: "84bce81b-d290-4c3a-b5ce-e7188df23c4e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.142536 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data" (OuterVolumeSpecName: "config-data") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.151411 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11864c16-c8f0-4933-bf0d-aa97e1c4e650" (UID: "11864c16-c8f0-4933-bf0d-aa97e1c4e650"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.153851 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0142b96-53a8-4914-963f-e1bad9a870e0" (UID: "b0142b96-53a8-4914-963f-e1bad9a870e0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159362 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159405 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159421 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159433 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159445 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x42lz\" (UniqueName: \"kubernetes.io/projected/b0142b96-53a8-4914-963f-e1bad9a870e0-kube-api-access-x42lz\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159458 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159469 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159479 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159488 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159502 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0142b96-53a8-4914-963f-e1bad9a870e0-scripts\") 
on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159512 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159522 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc7zg\" (UniqueName: \"kubernetes.io/projected/84bce81b-d290-4c3a-b5ce-e7188df23c4e-kube-api-access-lc7zg\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159578 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159593 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9tlv\" (UniqueName: \"kubernetes.io/projected/11864c16-c8f0-4933-bf0d-aa97e1c4e650-kube-api-access-w9tlv\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159603 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159613 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0142b96-53a8-4914-963f-e1bad9a870e0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159624 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.159637 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11864c16-c8f0-4933-bf0d-aa97e1c4e650-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.171441 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config" (OuterVolumeSpecName: "config") pod "84bce81b-d290-4c3a-b5ce-e7188df23c4e" (UID: "84bce81b-d290-4c3a-b5ce-e7188df23c4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.189331 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.261272 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84bce81b-d290-4c3a-b5ce-e7188df23c4e-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.261315 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.481915 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.481967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" event={"ID":"84bce81b-d290-4c3a-b5ce-e7188df23c4e","Type":"ContainerDied","Data":"7786504d3ca4bb2102292658ebbe9baa3da0e1e0852eb7410d188b293873cb3a"} Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.488054 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9kg4w" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.488339 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9kg4w" event={"ID":"11864c16-c8f0-4933-bf0d-aa97e1c4e650","Type":"ContainerDied","Data":"c99331f4772c8119b9e36da62e9a7cb923ba71329b35d614b4a1b4f59b3013f5"} Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.488382 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99331f4772c8119b9e36da62e9a7cb923ba71329b35d614b4a1b4f59b3013f5" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.492582 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.492855 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0142b96-53a8-4914-963f-e1bad9a870e0","Type":"ContainerDied","Data":"69b71cad79c1659dcc104073f6be27ff84c14f42dc6a65a1cafcacddd5ab566a"} Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.502496 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f\\\"\"" pod="openstack/barbican-db-sync-xc5wf" podUID="1b212057-565c-4246-820a-a804fb6da962" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.559482 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"] Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.574171 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-2mx6q"] Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.598645 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.608206 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.621515 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.623221 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-log" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623245 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-log" Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.623258 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="init" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623265 4954 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="init" Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.623279 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623286 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.623297 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11864c16-c8f0-4933-bf0d-aa97e1c4e650" containerName="keystone-bootstrap" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623306 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11864c16-c8f0-4933-bf0d-aa97e1c4e650" containerName="keystone-bootstrap" Dec 06 07:18:55 crc kubenswrapper[4954]: E1206 07:18:55.623332 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-httpd" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623339 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-httpd" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623528 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-httpd" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623588 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" containerName="glance-log" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.623605 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="11864c16-c8f0-4933-bf0d-aa97e1c4e650" containerName="keystone-bootstrap" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.624595 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.629729 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.629823 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.637035 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.786440 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz8pk\" (UniqueName: \"kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.786749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.786808 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.786916 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.787118 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.787812 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.787847 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.787873 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890235 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890316 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890340 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz8pk\" (UniqueName: \"kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890436 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890462 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.890493 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.891144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.893375 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.901647 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.902536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.915166 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.916282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.916624 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.936343 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz8pk\" (UniqueName: \"kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.964711 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:18:55 crc kubenswrapper[4954]: I1206 07:18:55.996115 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9kg4w"] Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.006186 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-9kg4w"] Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.095691 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wcw8k"] Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.097265 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.100344 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.100490 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.100838 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8jst" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.100984 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.105052 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wcw8k"] Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.253509 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.265553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.272635 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.272804 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.273023 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.273106 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.273207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rgx\" (UniqueName: \"kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc 
kubenswrapper[4954]: I1206 07:18:56.273260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.373824 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.373895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.373966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.374006 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.374054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rgx\" (UniqueName: \"kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.374089 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.379957 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.382797 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.382942 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.389256 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.391638 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.396704 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rgx\" (UniqueName: \"kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx\") pod \"keystone-bootstrap-wcw8k\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.584402 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.731286 4954 scope.go:117] "RemoveContainer" containerID="9578910258df4e76f347169a5399c7fb23a02dd465bb006c6de63efd52f48f71" Dec 06 07:18:56 crc kubenswrapper[4954]: E1206 07:18:56.755803 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 06 07:18:56 crc kubenswrapper[4954]: E1206 07:18:56.756617 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-npqzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fcgjm_openstack(24a53752-0ee5-43db-b968-b4d11414ffdb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 07:18:56 crc kubenswrapper[4954]: E1206 07:18:56.757924 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fcgjm" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.803982 4954 scope.go:117] "RemoveContainer" containerID="9c8b3230b89e7cd383e6e20c8ead07828efba3da118a04036e69862d1033971b" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.844417 4954 scope.go:117] "RemoveContainer" containerID="52fcf5af3db779d717e14fdc3a3c4f3d3aa22bcf9db40eb4664db0e461643a94" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.886975 4954 scope.go:117] "RemoveContainer" containerID="3c228de746b150019069d960694b69b97286f938c755c26b169c69119edf1149" Dec 06 07:18:56 crc kubenswrapper[4954]: I1206 07:18:56.985335 4954 scope.go:117] "RemoveContainer" containerID="333c190eb878e03213e09e785af05b76a563115fc89aee5caea585937ac11f76" Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.348450 4954 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-bootstrap-wcw8k"] Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.394741 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:18:57 crc kubenswrapper[4954]: W1206 07:18:57.407293 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55a1a9c0_08bd_400d_8b38_b4226dadc5d5.slice/crio-2761ee60708c7ea3fa05d389fccee48b2a7505d17e82f6bcb595cf0862d01396 WatchSource:0}: Error finding container 2761ee60708c7ea3fa05d389fccee48b2a7505d17e82f6bcb595cf0862d01396: Status 404 returned error can't find the container with id 2761ee60708c7ea3fa05d389fccee48b2a7505d17e82f6bcb595cf0862d01396 Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.480725 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11864c16-c8f0-4933-bf0d-aa97e1c4e650" path="/var/lib/kubelet/pods/11864c16-c8f0-4933-bf0d-aa97e1c4e650/volumes" Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.481819 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" path="/var/lib/kubelet/pods/84bce81b-d290-4c3a-b5ce-e7188df23c4e/volumes" Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.482765 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0142b96-53a8-4914-963f-e1bad9a870e0" path="/var/lib/kubelet/pods/b0142b96-53a8-4914-963f-e1bad9a870e0/volumes" Dec 06 07:18:57 crc kubenswrapper[4954]: W1206 07:18:57.484450 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f880ab_3992_479d_b71c_d71152a6199a.slice/crio-13f7dfaff04d47297b2510bf2b77929baba939adcfb9c219f900ce62c96d596b WatchSource:0}: Error finding container 13f7dfaff04d47297b2510bf2b77929baba939adcfb9c219f900ce62c96d596b: Status 404 returned error can't find the container with id 13f7dfaff04d47297b2510bf2b77929baba939adcfb9c219f900ce62c96d596b Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.486062 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.521995 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerStarted","Data":"13f7dfaff04d47297b2510bf2b77929baba939adcfb9c219f900ce62c96d596b"} Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.524883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wcw8k" event={"ID":"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3","Type":"ContainerStarted","Data":"9bd8678dbdf83af1b9985e38b33894b64f93736d13e0d2b31cf24248b679ecc6"} Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.527376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerStarted","Data":"2761ee60708c7ea3fa05d389fccee48b2a7505d17e82f6bcb595cf0862d01396"} Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.535663 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerStarted","Data":"3d51af3f527dae523af467b40835190221d458c85f1ca604efa4190bd08d8799"} Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.538978 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zvlz" event={"ID":"c5f9fd4e-cc86-43cf-8195-cae0becb9e64","Type":"ContainerStarted","Data":"2ec47098771e81a3e2ed711acad31be02d7763e790d88dde621f4b4966df4ef7"} Dec 06 07:18:57 crc kubenswrapper[4954]: E1206 07:18:57.549485 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-fcgjm" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" Dec 06 07:18:57 crc kubenswrapper[4954]: I1206 07:18:57.572490 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9zvlz" podStartSLOduration=2.934645888 podStartE2EDuration="27.57245164s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="2025-12-06 07:18:32.031597362 +0000 UTC m=+1286.844956751" lastFinishedPulling="2025-12-06 07:18:56.669403114 +0000 UTC m=+1311.482762503" observedRunningTime="2025-12-06 07:18:57.556608435 +0000 UTC m=+1312.369967834" watchObservedRunningTime="2025-12-06 07:18:57.57245164 +0000 UTC m=+1312.385811029" Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.362753 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-784d65c867-2mx6q" podUID="84bce81b-d290-4c3a-b5ce-e7188df23c4e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.573065 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerStarted","Data":"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161"} Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.585259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wcw8k" event={"ID":"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3","Type":"ContainerStarted","Data":"73b4fc68357e95fab522f5bfaa524801ed8b0c73a9d9a696e2f90c09e2776edb"} Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.598682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerStarted","Data":"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3"} Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.607977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerStarted","Data":"e1f4adf7be94100797eae7cc8de157aaf57ae9a3f550ae855dcaccd1e3c043bc"} Dec 06 07:18:58 crc kubenswrapper[4954]: I1206 07:18:58.626409 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wcw8k" podStartSLOduration=2.626386286 podStartE2EDuration="2.626386286s" podCreationTimestamp="2025-12-06 07:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:58.607947912 +0000 UTC m=+1313.421307321" watchObservedRunningTime="2025-12-06 07:18:58.626386286 +0000 UTC m=+1313.439745695" Dec 06 07:18:59 crc kubenswrapper[4954]: I1206 07:18:59.631058 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerStarted","Data":"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b"} Dec 06 07:18:59 crc kubenswrapper[4954]: I1206 07:18:59.635244 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerStarted","Data":"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62"} Dec 06 07:18:59 crc kubenswrapper[4954]: I1206 07:18:59.674344 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.674314081 podStartE2EDuration="14.674314081s" podCreationTimestamp="2025-12-06 07:18:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:59.657527821 +0000 UTC m=+1314.470887230" watchObservedRunningTime="2025-12-06 07:18:59.674314081 +0000 UTC m=+1314.487673460" Dec 06 07:18:59 crc kubenswrapper[4954]: I1206 07:18:59.697757 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.697718128 podStartE2EDuration="4.697718128s" podCreationTimestamp="2025-12-06 07:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:18:59.684133374 +0000 UTC m=+1314.497492773" watchObservedRunningTime="2025-12-06 07:18:59.697718128 +0000 UTC m=+1314.511077517" Dec 06 07:19:00 crc kubenswrapper[4954]: I1206 07:19:00.652668 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5f9fd4e-cc86-43cf-8195-cae0becb9e64" containerID="2ec47098771e81a3e2ed711acad31be02d7763e790d88dde621f4b4966df4ef7" exitCode=0 Dec 06 07:19:00 crc kubenswrapper[4954]: I1206 07:19:00.652738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zvlz" event={"ID":"c5f9fd4e-cc86-43cf-8195-cae0becb9e64","Type":"ContainerDied","Data":"2ec47098771e81a3e2ed711acad31be02d7763e790d88dde621f4b4966df4ef7"} Dec 06 07:19:02 crc kubenswrapper[4954]: I1206 07:19:02.690730 4954 generic.go:334] "Generic (PLEG): container finished" podID="fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" containerID="73b4fc68357e95fab522f5bfaa524801ed8b0c73a9d9a696e2f90c09e2776edb" exitCode=0 Dec 06 07:19:02 crc kubenswrapper[4954]: I1206 07:19:02.690807 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wcw8k" event={"ID":"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3","Type":"ContainerDied","Data":"73b4fc68357e95fab522f5bfaa524801ed8b0c73a9d9a696e2f90c09e2776edb"} Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.266538 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9zvlz" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.466873 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle\") pod \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.467194 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs\") pod \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.467327 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-288vg\" (UniqueName: \"kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg\") pod \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.467401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts\") pod \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.467424 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data\") pod \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\" (UID: \"c5f9fd4e-cc86-43cf-8195-cae0becb9e64\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.467972 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs" (OuterVolumeSpecName: "logs") pod "c5f9fd4e-cc86-43cf-8195-cae0becb9e64" (UID: "c5f9fd4e-cc86-43cf-8195-cae0becb9e64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.474839 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts" (OuterVolumeSpecName: "scripts") pod "c5f9fd4e-cc86-43cf-8195-cae0becb9e64" (UID: "c5f9fd4e-cc86-43cf-8195-cae0becb9e64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.474976 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg" (OuterVolumeSpecName: "kube-api-access-288vg") pod "c5f9fd4e-cc86-43cf-8195-cae0becb9e64" (UID: "c5f9fd4e-cc86-43cf-8195-cae0becb9e64"). InnerVolumeSpecName "kube-api-access-288vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.503553 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data" (OuterVolumeSpecName: "config-data") pod "c5f9fd4e-cc86-43cf-8195-cae0becb9e64" (UID: "c5f9fd4e-cc86-43cf-8195-cae0becb9e64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.506290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f9fd4e-cc86-43cf-8195-cae0becb9e64" (UID: "c5f9fd4e-cc86-43cf-8195-cae0becb9e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.571093 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.573733 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.573779 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-288vg\" (UniqueName: \"kubernetes.io/projected/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-kube-api-access-288vg\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.573793 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.573804 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f9fd4e-cc86-43cf-8195-cae0becb9e64-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.709338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerStarted","Data":"9ccf00a975a945e1ec654f4d1f89e52233116cdf8b21e852533f3e44c9272d32"} Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.711477 4954 generic.go:334] "Generic (PLEG): container finished" podID="8866facf-9bb1-4deb-9244-15805d162346" containerID="71b04eafce145d86555e0beb19909a5726709e8e20af73819ad4cf85b6cd66fc" exitCode=0 Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.711588 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-khvqc" event={"ID":"8866facf-9bb1-4deb-9244-15805d162346","Type":"ContainerDied","Data":"71b04eafce145d86555e0beb19909a5726709e8e20af73819ad4cf85b6cd66fc"} Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.713886 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zvlz" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.713924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zvlz" event={"ID":"c5f9fd4e-cc86-43cf-8195-cae0becb9e64","Type":"ContainerDied","Data":"1135bbf7d222445109c31211f37d1e1c17c046a3f9e8171323fa2b652ac9193a"} Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.713993 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1135bbf7d222445109c31211f37d1e1c17c046a3f9e8171323fa2b652ac9193a" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.972749 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983599 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983729 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rgx\" (UniqueName: \"kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983865 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.983889 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data\") pod \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\" (UID: \"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3\") " Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.993001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.995358 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx" (OuterVolumeSpecName: "kube-api-access-v4rgx") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "kube-api-access-v4rgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.995602 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts" (OuterVolumeSpecName: "scripts") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:03 crc kubenswrapper[4954]: I1206 07:19:03.996625 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.017243 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data" (OuterVolumeSpecName: "config-data") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.018176 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" (UID: "fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086467 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086514 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086528 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4rgx\" (UniqueName: \"kubernetes.io/projected/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-kube-api-access-v4rgx\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086538 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086547 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.086587 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.381206 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:19:04 crc kubenswrapper[4954]: E1206 07:19:04.382471 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f9fd4e-cc86-43cf-8195-cae0becb9e64" containerName="placement-db-sync" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.382503 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f9fd4e-cc86-43cf-8195-cae0becb9e64" containerName="placement-db-sync" Dec 06 07:19:04 crc kubenswrapper[4954]: E1206 
07:19:04.382539 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" containerName="keystone-bootstrap" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.382548 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" containerName="keystone-bootstrap" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.382787 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" containerName="keystone-bootstrap" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.382818 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f9fd4e-cc86-43cf-8195-cae0becb9e64" containerName="placement-db-sync" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.385680 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.391771 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.392279 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.392511 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.392866 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6n4f8" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.393315 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.394257 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.394790 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvdnt\" (UniqueName: \"kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.394938 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.395602 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.395738 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " 
pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.395845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.396013 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.396151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.497989 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvdnt\" (UniqueName: \"kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.498386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.498579 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.498731 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.498885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.499026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.499124 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.500950 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.505813 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.509541 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.511474 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.513417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.516595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.517161 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvdnt\" (UniqueName: \"kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt\") pod \"placement-55bf8ff4-7brlv\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") " pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.715020 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.729050 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wcw8k" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.731641 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wcw8k" event={"ID":"fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3","Type":"ContainerDied","Data":"9bd8678dbdf83af1b9985e38b33894b64f93736d13e0d2b31cf24248b679ecc6"} Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.731689 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd8678dbdf83af1b9985e38b33894b64f93736d13e0d2b31cf24248b679ecc6" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.828076 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.830002 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.833114 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.833954 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.834283 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.834544 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.834811 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r8jst" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.835013 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.879734 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908002 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts\") pod 
\"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908242 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:04 crc kubenswrapper[4954]: I1206 07:19:04.908342 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6tsh\" (UniqueName: \"kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009420 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009570 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6tsh\" (UniqueName: \"kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009631 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009656 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009686 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data\") pod 
\"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009740 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.009762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.017420 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.017926 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.018131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.018750 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.019672 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.022037 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.022342 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.035597 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6tsh\" (UniqueName: \"kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh\") pod \"keystone-55864b7b7d-dsx9g\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.141057 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-khvqc" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.169443 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.213779 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgdcc\" (UniqueName: \"kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc\") pod \"8866facf-9bb1-4deb-9244-15805d162346\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.213876 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config\") pod \"8866facf-9bb1-4deb-9244-15805d162346\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.214516 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle\") pod \"8866facf-9bb1-4deb-9244-15805d162346\" (UID: \"8866facf-9bb1-4deb-9244-15805d162346\") " Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.220191 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc" (OuterVolumeSpecName: "kube-api-access-jgdcc") pod "8866facf-9bb1-4deb-9244-15805d162346" (UID: "8866facf-9bb1-4deb-9244-15805d162346"). InnerVolumeSpecName "kube-api-access-jgdcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.247250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8866facf-9bb1-4deb-9244-15805d162346" (UID: "8866facf-9bb1-4deb-9244-15805d162346"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.254633 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config" (OuterVolumeSpecName: "config") pod "8866facf-9bb1-4deb-9244-15805d162346" (UID: "8866facf-9bb1-4deb-9244-15805d162346"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.317144 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.317203 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgdcc\" (UniqueName: \"kubernetes.io/projected/8866facf-9bb1-4deb-9244-15805d162346-kube-api-access-jgdcc\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.317218 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8866facf-9bb1-4deb-9244-15805d162346-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.375774 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.705869 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.780083 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55864b7b7d-dsx9g" event={"ID":"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a","Type":"ContainerStarted","Data":"bc6e5869146f90030326cf9214bc35bbc536cef99cafd3be22659e03b413e05f"} Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.799942 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerStarted","Data":"27d6334f40e13cafc61de087025fab078b3fe41774336c33565c8ae4035e063f"} Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.800002 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerStarted","Data":"c0bf167dc2e4d497caa12502cdabb27efe38c789d6566ae953f0ddb3be6b425e"} Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.801383 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-khvqc" event={"ID":"8866facf-9bb1-4deb-9244-15805d162346","Type":"ContainerDied","Data":"c53a61a1f1a3aa8365da3fcccfbf47d0de46a3c6c8eaadcd8fa1065601876551"} Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.801407 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c53a61a1f1a3aa8365da3fcccfbf47d0de46a3c6c8eaadcd8fa1065601876551" Dec 06 07:19:05 crc kubenswrapper[4954]: I1206 07:19:05.801496 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-khvqc" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.031156 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:06 crc kubenswrapper[4954]: E1206 07:19:06.031726 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8866facf-9bb1-4deb-9244-15805d162346" containerName="neutron-db-sync" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.031744 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8866facf-9bb1-4deb-9244-15805d162346" containerName="neutron-db-sync" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.031943 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8866facf-9bb1-4deb-9244-15805d162346" containerName="neutron-db-sync" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.035687 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.057994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4cj8\" (UniqueName: \"kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.058052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.058083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.058124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.058145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.058335 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.093860 4954 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164114 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4cj8\" (UniqueName: \"kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164186 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164220 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164263 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164286 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.164355 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.165654 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.165724 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.166209 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc 
kubenswrapper[4954]: I1206 07:19:06.166920 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.166986 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.186505 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.190051 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.192126 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4cj8\" (UniqueName: \"kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8\") pod \"dnsmasq-dns-67dfc45497-kr42c\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.196469 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.196807 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.196957 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.198946 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9sp84" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.211093 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.214830 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.216578 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.274087 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.274184 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.307804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.317279 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.337140 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.358621 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.368799 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.368964 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.369031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.369161 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.369204 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npr2\" (UniqueName: \"kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.428043 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.474921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.475184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.475259 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npr2\" (UniqueName: \"kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.475362 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.475464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.495935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.502907 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.504087 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.505384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.507371 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9npr2\" (UniqueName: \"kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2\") pod \"neutron-ff8d8cf8b-qx8qg\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.625504 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.840432 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55864b7b7d-dsx9g" event={"ID":"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a","Type":"ContainerStarted","Data":"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7"} Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.841011 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.848767 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerStarted","Data":"6ab0f45e984c650aee9fc279116d7bc9e68e5fe5d6b74cb802fbb810f6d5de26"} Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.849382 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.849430 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.849444 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.849456 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.849516 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.904516 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55864b7b7d-dsx9g" podStartSLOduration=2.904479755 podStartE2EDuration="2.904479755s" podCreationTimestamp="2025-12-06 07:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:06.872513377 +0000 UTC m=+1321.685872786" watchObservedRunningTime="2025-12-06 07:19:06.904479755 +0000 UTC m=+1321.717839144" Dec 06 07:19:06 crc kubenswrapper[4954]: I1206 07:19:06.934319 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55bf8ff4-7brlv" podStartSLOduration=2.934289295 podStartE2EDuration="2.934289295s" podCreationTimestamp="2025-12-06 07:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:06.912917271 +0000 UTC m=+1321.726276680" watchObservedRunningTime="2025-12-06 07:19:06.934289295 +0000 UTC m=+1321.747648684" Dec 06 07:19:07 crc kubenswrapper[4954]: I1206 07:19:07.043112 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:07 crc kubenswrapper[4954]: I1206 07:19:07.494119 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:07 crc kubenswrapper[4954]: I1206 07:19:07.880028 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" event={"ID":"c36304c6-bb92-409f-9700-d45ac07bca8a","Type":"ContainerStarted","Data":"1c209e1225c76a2a5f479efe8ee4e7a25d1729ea17e97957e9da4cc9edba3ada"} Dec 06 07:19:07 crc kubenswrapper[4954]: I1206 07:19:07.883214 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerStarted","Data":"8fb4a70b034d55c04a0f570cc7f1dcb24563adbff10cb69861b18e631a6e65e4"} Dec 06 07:19:07 crc kubenswrapper[4954]: I1206 07:19:07.885158 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.895398 4954 generic.go:334] "Generic (PLEG): container finished" podID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerID="7d138999999fc4107a55db7ab0c535ac484c83e49f640de1f842c8caf3e53734" exitCode=0 Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.895596 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" event={"ID":"c36304c6-bb92-409f-9700-d45ac07bca8a","Type":"ContainerDied","Data":"7d138999999fc4107a55db7ab0c535ac484c83e49f640de1f842c8caf3e53734"} Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.922372 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.923054 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.923113 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.923144 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:19:08 crc kubenswrapper[4954]: I1206 07:19:08.924203 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerStarted","Data":"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22"} Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.126553 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.129309 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.134717 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.135305 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.144185 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.175868 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.176241 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jtr\" (UniqueName: \"kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.176357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.176463 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.179744 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.179869 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.180031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.281470 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.282490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jtr\" (UniqueName: \"kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.282716 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.282949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.283323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.283479 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.283639 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.289797 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.293389 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.294475 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: 
\"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.294495 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.295654 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.311043 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.317431 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jtr\" (UniqueName: \"kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr\") pod \"neutron-78654684fc-84hfw\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.503037 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.612295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.733374 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.734024 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.958040 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerStarted","Data":"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e"} Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.958804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.972902 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.973445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" event={"ID":"c36304c6-bb92-409f-9700-d45ac07bca8a","Type":"ContainerStarted","Data":"6f6eabd999407ab0205f96cb4ec712282a45d1316429c1b890f020566f380231"} Dec 06 07:19:09 crc kubenswrapper[4954]: I1206 07:19:09.975971 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.007600 4954 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-ff8d8cf8b-qx8qg" podStartSLOduration=4.007524186 podStartE2EDuration="4.007524186s" podCreationTimestamp="2025-12-06 07:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:09.990343795 +0000 UTC m=+1324.803703324" watchObservedRunningTime="2025-12-06 07:19:10.007524186 +0000 UTC m=+1324.820883575" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.043480 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" podStartSLOduration=5.04343392 podStartE2EDuration="5.04343392s" podCreationTimestamp="2025-12-06 07:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:10.031719025 +0000 UTC m=+1324.845078424" watchObservedRunningTime="2025-12-06 07:19:10.04343392 +0000 UTC m=+1324.856793309" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.101315 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.101407 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.101469 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.102434 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.102503 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679" gracePeriod=600 Dec 06 07:19:10 crc kubenswrapper[4954]: W1206 07:19:10.196536 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod096c2131_031b_4573_ade7_b1d0d34abc60.slice/crio-36be5325833648901d84529ec4c478968dce341b4f206524051d37b86a68aec2 WatchSource:0}: Error finding container 36be5325833648901d84529ec4c478968dce341b4f206524051d37b86a68aec2: Status 404 returned error can't find the container with id 36be5325833648901d84529ec4c478968dce341b4f206524051d37b86a68aec2 Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.197982 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.446541 
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.446541 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.988762 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679" exitCode=0
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.988820 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679"}
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.989382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689"}
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.989406 4954 scope.go:117] "RemoveContainer" containerID="b2a8e38b61392d54e95e434affe8fa8be8bd703de4a146acbf88d2066f517403"
Dec 06 07:19:10 crc kubenswrapper[4954]: I1206 07:19:10.991350 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerStarted","Data":"36be5325833648901d84529ec4c478968dce341b4f206524051d37b86a68aec2"}
Dec 06 07:19:12 crc kubenswrapper[4954]: I1206 07:19:12.010395 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerStarted","Data":"59705182107d1f5a2685c487316a4e8779bdf03f701931d0f8c6734cc83fb7c6"}
Dec 06 07:19:16 crc kubenswrapper[4954]: I1206 07:19:16.431518 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dfc45497-kr42c"
Dec 06 07:19:16 crc kubenswrapper[4954]: I1206 07:19:16.505361 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"]
Dec 06 07:19:16 crc kubenswrapper[4954]: I1206 07:19:16.505843 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79464d554c-vgvct" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="dnsmasq-dns" containerID="cri-o://3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d" gracePeriod=10
Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.067501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerStarted","Data":"feff0af557a19a07e436a4ff2db6401f9895dc0adfba494469ecdb5c414013e9"}
Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.067953 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78654684fc-84hfw"
Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.069949 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xc5wf" event={"ID":"1b212057-565c-4246-820a-a804fb6da962","Type":"ContainerStarted","Data":"511c18ca4df0ed8aff0072864f0d0e77eddb735086b7978043e9e0becffa93c2"}
Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.074628 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerStarted","Data":"a7e5f1a022086eb3694f5433449850d884c9185bfe347cb13fc34b36f3e20473"} Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.074811 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-central-agent" containerID="cri-o://3d51af3f527dae523af467b40835190221d458c85f1ca604efa4190bd08d8799" gracePeriod=30 Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.075030 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.075082 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="proxy-httpd" containerID="cri-o://a7e5f1a022086eb3694f5433449850d884c9185bfe347cb13fc34b36f3e20473" gracePeriod=30 Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.075139 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="sg-core" containerID="cri-o://9ccf00a975a945e1ec654f4d1f89e52233116cdf8b21e852533f3e44c9272d32" gracePeriod=30 Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.075183 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-notification-agent" containerID="cri-o://e1f4adf7be94100797eae7cc8de157aaf57ae9a3f550ae855dcaccd1e3c043bc" gracePeriod=30 Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.095947 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78654684fc-84hfw" podStartSLOduration=8.095919226 podStartE2EDuration="8.095919226s" podCreationTimestamp="2025-12-06 07:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:17.094870878 +0000 UTC m=+1331.908230267" watchObservedRunningTime="2025-12-06 07:19:17.095919226 +0000 UTC m=+1331.909278725" Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.127941 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xc5wf" podStartSLOduration=3.390539184 podStartE2EDuration="47.127918175s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="2025-12-06 07:18:31.993810228 +0000 UTC m=+1286.807169617" lastFinishedPulling="2025-12-06 07:19:15.731189219 +0000 UTC m=+1330.544548608" observedRunningTime="2025-12-06 07:19:17.120876976 +0000 UTC m=+1331.934236385" watchObservedRunningTime="2025-12-06 07:19:17.127918175 +0000 UTC m=+1331.941277564" Dec 06 07:19:17 crc kubenswrapper[4954]: I1206 07:19:17.153136 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.392529377 podStartE2EDuration="47.153109661s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="2025-12-06 07:18:32.031296413 +0000 UTC m=+1286.844655802" lastFinishedPulling="2025-12-06 07:19:15.791876697 +0000 UTC m=+1330.605236086" observedRunningTime="2025-12-06 07:19:17.144173641 +0000 UTC m=+1331.957533040" watchObservedRunningTime="2025-12-06 07:19:17.153109661 +0000 UTC m=+1331.966469050" Dec 06 07:19:18 crc kubenswrapper[4954]: 
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.086345 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fcgjm" event={"ID":"24a53752-0ee5-43db-b968-b4d11414ffdb","Type":"ContainerStarted","Data":"df8e570779d97aab43c23df58a337a053e63cc302fbe36e683dc6788a6773e03"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.092919 4954 generic.go:334] "Generic (PLEG): container finished" podID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerID="a7e5f1a022086eb3694f5433449850d884c9185bfe347cb13fc34b36f3e20473" exitCode=0
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.092951 4954 generic.go:334] "Generic (PLEG): container finished" podID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerID="9ccf00a975a945e1ec654f4d1f89e52233116cdf8b21e852533f3e44c9272d32" exitCode=2
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.092961 4954 generic.go:334] "Generic (PLEG): container finished" podID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerID="3d51af3f527dae523af467b40835190221d458c85f1ca604efa4190bd08d8799" exitCode=0
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.092980 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerDied","Data":"a7e5f1a022086eb3694f5433449850d884c9185bfe347cb13fc34b36f3e20473"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.093055 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerDied","Data":"9ccf00a975a945e1ec654f4d1f89e52233116cdf8b21e852533f3e44c9272d32"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.093066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerDied","Data":"3d51af3f527dae523af467b40835190221d458c85f1ca604efa4190bd08d8799"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.095725 4954 generic.go:334] "Generic (PLEG): container finished" podID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerID="3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d" exitCode=0
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.095792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerDied","Data":"3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.095985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-vgvct" event={"ID":"cefc5dd8-7bea-4938-8489-7b5edc4e79f8","Type":"ContainerDied","Data":"a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a"}
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.096009 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6212ffd5e473996c082c7ba70e59d677a426b234e5dcae3c5d3c9c48a02a50a"
Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.113397 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fcgjm" podStartSLOduration=4.207707185 podStartE2EDuration="48.113370052s" podCreationTimestamp="2025-12-06 07:18:30 +0000 UTC" firstStartedPulling="2025-12-06 07:18:31.824793422 +0000 UTC m=+1286.638152811" lastFinishedPulling="2025-12-06 07:19:15.730456289 +0000 UTC m=+1330.543815678" observedRunningTime="2025-12-06 07:19:18.105387768 +0000 UTC m=+1332.918747157"
watchObservedRunningTime="2025-12-06 07:19:18.113370052 +0000 UTC m=+1332.926729441" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.132191 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236378 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236632 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236700 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236772 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236834 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.236890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g94p\" (UniqueName: \"kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p\") pod \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\" (UID: \"cefc5dd8-7bea-4938-8489-7b5edc4e79f8\") " Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.244852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p" (OuterVolumeSpecName: "kube-api-access-5g94p") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "kube-api-access-5g94p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.292284 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.298794 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config" (OuterVolumeSpecName: "config") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.307475 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.323097 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.326412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cefc5dd8-7bea-4938-8489-7b5edc4e79f8" (UID: "cefc5dd8-7bea-4938-8489-7b5edc4e79f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339429 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339482 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339493 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339507 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339519 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:18 crc kubenswrapper[4954]: I1206 07:19:18.339531 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g94p\" (UniqueName: \"kubernetes.io/projected/cefc5dd8-7bea-4938-8489-7b5edc4e79f8-kube-api-access-5g94p\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:19 crc kubenswrapper[4954]: I1206 07:19:19.106786 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-vgvct" Dec 06 07:19:19 crc kubenswrapper[4954]: I1206 07:19:19.153094 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"] Dec 06 07:19:19 crc kubenswrapper[4954]: I1206 07:19:19.161828 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-vgvct"] Dec 06 07:19:19 crc kubenswrapper[4954]: I1206 07:19:19.455795 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" path="/var/lib/kubelet/pods/cefc5dd8-7bea-4938-8489-7b5edc4e79f8/volumes" Dec 06 07:19:20 crc kubenswrapper[4954]: I1206 07:19:20.119026 4954 generic.go:334] "Generic (PLEG): container finished" podID="1b212057-565c-4246-820a-a804fb6da962" containerID="511c18ca4df0ed8aff0072864f0d0e77eddb735086b7978043e9e0becffa93c2" exitCode=0 Dec 06 07:19:20 crc kubenswrapper[4954]: I1206 07:19:20.119087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xc5wf" event={"ID":"1b212057-565c-4246-820a-a804fb6da962","Type":"ContainerDied","Data":"511c18ca4df0ed8aff0072864f0d0e77eddb735086b7978043e9e0becffa93c2"} Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.141084 4954 generic.go:334] "Generic (PLEG): container finished" podID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerID="e1f4adf7be94100797eae7cc8de157aaf57ae9a3f550ae855dcaccd1e3c043bc" exitCode=0 Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.141878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerDied","Data":"e1f4adf7be94100797eae7cc8de157aaf57ae9a3f550ae855dcaccd1e3c043bc"} Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.277955 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407489 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407678 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407737 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407812 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407855 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd8cr\" (UniqueName: \"kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407888 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.407923 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd\") pod \"f74aa2d5-0275-4b4c-a394-468414e6f840\" (UID: \"f74aa2d5-0275-4b4c-a394-468414e6f840\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.409164 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.410806 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.415863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr" (OuterVolumeSpecName: "kube-api-access-gd8cr") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "kube-api-access-gd8cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.416003 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts" (OuterVolumeSpecName: "scripts") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.450392 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.510203 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.510470 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.510552 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd8cr\" (UniqueName: \"kubernetes.io/projected/f74aa2d5-0275-4b4c-a394-468414e6f840-kube-api-access-gd8cr\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.510786 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.510865 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f74aa2d5-0275-4b4c-a394-468414e6f840-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.517977 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.520212 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.525721 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data" (OuterVolumeSpecName: "config-data") pod "f74aa2d5-0275-4b4c-a394-468414e6f840" (UID: "f74aa2d5-0275-4b4c-a394-468414e6f840"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.612830 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc4jw\" (UniqueName: \"kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw\") pod \"1b212057-565c-4246-820a-a804fb6da962\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.612952 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle\") pod \"1b212057-565c-4246-820a-a804fb6da962\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.613075 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data\") pod \"1b212057-565c-4246-820a-a804fb6da962\" (UID: \"1b212057-565c-4246-820a-a804fb6da962\") " Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.614448 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.614472 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa2d5-0275-4b4c-a394-468414e6f840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.615873 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw" (OuterVolumeSpecName: "kube-api-access-cc4jw") pod "1b212057-565c-4246-820a-a804fb6da962" (UID: "1b212057-565c-4246-820a-a804fb6da962"). InnerVolumeSpecName "kube-api-access-cc4jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.617481 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1b212057-565c-4246-820a-a804fb6da962" (UID: "1b212057-565c-4246-820a-a804fb6da962"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.642221 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b212057-565c-4246-820a-a804fb6da962" (UID: "1b212057-565c-4246-820a-a804fb6da962"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.716911 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.717255 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc4jw\" (UniqueName: \"kubernetes.io/projected/1b212057-565c-4246-820a-a804fb6da962-kube-api-access-cc4jw\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:21 crc kubenswrapper[4954]: I1206 07:19:21.717334 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b212057-565c-4246-820a-a804fb6da962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.160014 4954 generic.go:334] "Generic (PLEG): container finished" podID="24a53752-0ee5-43db-b968-b4d11414ffdb" containerID="df8e570779d97aab43c23df58a337a053e63cc302fbe36e683dc6788a6773e03" exitCode=0 Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.160121 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fcgjm" event={"ID":"24a53752-0ee5-43db-b968-b4d11414ffdb","Type":"ContainerDied","Data":"df8e570779d97aab43c23df58a337a053e63cc302fbe36e683dc6788a6773e03"} Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.164923 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f74aa2d5-0275-4b4c-a394-468414e6f840","Type":"ContainerDied","Data":"ae08e3a0dba5fd3b4b77ad50e0a1477e3f4d41f698c372d054548bfe692081be"} Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.164994 4954 scope.go:117] "RemoveContainer" containerID="a7e5f1a022086eb3694f5433449850d884c9185bfe347cb13fc34b36f3e20473" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.165150 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.172629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xc5wf" event={"ID":"1b212057-565c-4246-820a-a804fb6da962","Type":"ContainerDied","Data":"8dcf9db4bd9e3a9667f7c4d856314e1bce8bd44ce7234114db928d38d365f756"} Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.172726 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xc5wf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.172739 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dcf9db4bd9e3a9667f7c4d856314e1bce8bd44ce7234114db928d38d365f756" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.212838 4954 scope.go:117] "RemoveContainer" containerID="9ccf00a975a945e1ec654f4d1f89e52233116cdf8b21e852533f3e44c9272d32" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.237040 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.246974 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.259200 4954 scope.go:117] "RemoveContainer" containerID="e1f4adf7be94100797eae7cc8de157aaf57ae9a3f550ae855dcaccd1e3c043bc" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.269794 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270430 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-notification-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270457 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-notification-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270479 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="dnsmasq-dns" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270487 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="dnsmasq-dns" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270512 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="sg-core" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270523 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="sg-core" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270539 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b212057-565c-4246-820a-a804fb6da962" containerName="barbican-db-sync" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270548 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b212057-565c-4246-820a-a804fb6da962" containerName="barbican-db-sync" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270589 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="init" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270599 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="init" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270614 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="proxy-httpd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270621 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="proxy-httpd" Dec 06 07:19:22 crc kubenswrapper[4954]: E1206 07:19:22.270637 4954 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-central-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270665 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-central-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270906 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="sg-core" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270930 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-notification-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270938 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b212057-565c-4246-820a-a804fb6da962" containerName="barbican-db-sync" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefc5dd8-7bea-4938-8489-7b5edc4e79f8" containerName="dnsmasq-dns" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270960 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="proxy-httpd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.270970 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" containerName="ceilometer-central-agent" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.272842 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.278454 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.278655 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.296822 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.320760 4954 scope.go:117] "RemoveContainer" containerID="3d51af3f527dae523af467b40835190221d458c85f1ca604efa4190bd08d8799" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.379225 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.381347 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.388811 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.389044 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.389958 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-868gk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.402875 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.434820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.434928 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.434957 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.434995 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.435049 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.435143 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.435186 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64474\" (UniqueName: \"kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.523479 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"] Dec 06 07:19:22 
crc kubenswrapper[4954]: I1206 07:19:22.531085 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.537164 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538723 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538778 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jb8\" (UniqueName: \"kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538820 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538857 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64474\" (UniqueName: \"kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538879 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538945 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.538987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.539008 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.539035 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.539056 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.539095 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.539619 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.543539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.552149 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.553784 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.557046 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.567682 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.567766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts\") pod \"ceilometer-0\" (UID: 
\"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.575315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64474\" (UniqueName: \"kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474\") pod \"ceilometer-0\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.598297 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.640628 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.642949 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.656725 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.656813 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.656902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.657055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.657298 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.657452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jb8\" (UniqueName: \"kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.657730 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.658123 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.658244 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.658351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmcsq\" (UniqueName: \"kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.658792 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.664209 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.667481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.673188 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom\") pod \"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.676465 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.688645 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jb8\" (UniqueName: \"kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8\") pod 
\"barbican-worker-5bf7788f9-vw5rh\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") " pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.714631 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.743943 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.752704 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.762158 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777659 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmcsq\" (UniqueName: \"kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777798 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777856 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777900 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.777917 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778015 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24jd\" (UniqueName: \"kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778053 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778164 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72sr\" (UniqueName: \"kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.778312 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc 
kubenswrapper[4954]: I1206 07:19:22.778338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.782030 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.788385 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.798870 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.803218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.804358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.828542 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmcsq\" (UniqueName: \"kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq\") pod \"barbican-keystone-listener-7d7bb8bff8-tdqdk\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") " pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.883720 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b24jd\" (UniqueName: \"kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.884220 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.884250 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.886002 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.886106 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72sr\" (UniqueName: \"kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889381 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889591 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889780 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889820 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889947 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.889978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.890005 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.890069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.891704 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.891824 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.892038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.892634 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.893169 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.902323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.903129 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom\") pod \"barbican-api-79d7899494-4fjgd\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.911100 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24jd\" (UniqueName: \"kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd\") pod \"barbican-api-79d7899494-4fjgd\" (UID: 
\"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:22 crc kubenswrapper[4954]: I1206 07:19:22.918254 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72sr\" (UniqueName: \"kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr\") pod \"dnsmasq-dns-5768d59dd9-5lhjf\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.032137 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.051333 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.139358 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.199305 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:23 crc kubenswrapper[4954]: W1206 07:19:23.236753 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ac7d91_1d01_4997_8d95_2ca7e74c2ae1.slice/crio-47fa2c18299344862be26695529db49df9522857e530254505cd47960b91bd42 WatchSource:0}: Error finding container 47fa2c18299344862be26695529db49df9522857e530254505cd47960b91bd42: Status 404 returned error can't find the container with id 47fa2c18299344862be26695529db49df9522857e530254505cd47960b91bd42 Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.341226 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:19:23 crc kubenswrapper[4954]: W1206 07:19:23.341524 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb907c888_706a_4183_b581_ff7b4742fc74.slice/crio-fdf17bd0dc1ae4c72061ac5a70e2ecda5d81bd81ba6b0830f0afb75b6ad277b4 WatchSource:0}: Error finding container fdf17bd0dc1ae4c72061ac5a70e2ecda5d81bd81ba6b0830f0afb75b6ad277b4: Status 404 returned error can't find the container with id fdf17bd0dc1ae4c72061ac5a70e2ecda5d81bd81ba6b0830f0afb75b6ad277b4 Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.455518 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74aa2d5-0275-4b4c-a394-468414e6f840" path="/var/lib/kubelet/pods/f74aa2d5-0275-4b4c-a394-468414e6f840/volumes" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.535067 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"] Dec 06 07:19:23 crc kubenswrapper[4954]: W1206 07:19:23.544351 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13e11c3_b93d_4671_b9a7_961ab83bd23e.slice/crio-f6db7f340df6fbda501087af9dab6a6a1d0f26bd3fdf5cd8bb0e29db47e1856b WatchSource:0}: Error finding container f6db7f340df6fbda501087af9dab6a6a1d0f26bd3fdf5cd8bb0e29db47e1856b: Status 404 returned error can't find the container with id f6db7f340df6fbda501087af9dab6a6a1d0f26bd3fdf5cd8bb0e29db47e1856b Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.586863 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722492 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722676 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722806 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npqzh\" (UniqueName: \"kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722857 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722934 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.722986 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts\") pod \"24a53752-0ee5-43db-b968-b4d11414ffdb\" (UID: \"24a53752-0ee5-43db-b968-b4d11414ffdb\") " Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.729372 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh" (OuterVolumeSpecName: "kube-api-access-npqzh") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "kube-api-access-npqzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.729868 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts" (OuterVolumeSpecName: "scripts") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.729952 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.730044 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.761792 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.764747 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:23 crc kubenswrapper[4954]: W1206 07:19:23.771119 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248b92ff_f8e8_40a1_beec_1d456e88d1c2.slice/crio-9bf9a672335a9f4cc7a899571de11fb912c8c743a935ada8059d09b3dce90040 WatchSource:0}: Error finding container 9bf9a672335a9f4cc7a899571de11fb912c8c743a935ada8059d09b3dce90040: Status 404 returned error can't find the container with id 9bf9a672335a9f4cc7a899571de11fb912c8c743a935ada8059d09b3dce90040 Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.773525 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.790051 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data" (OuterVolumeSpecName: "config-data") pod "24a53752-0ee5-43db-b968-b4d11414ffdb" (UID: "24a53752-0ee5-43db-b968-b4d11414ffdb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.825522 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.826160 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.826175 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a53752-0ee5-43db-b968-b4d11414ffdb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.826192 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npqzh\" (UniqueName: \"kubernetes.io/projected/24a53752-0ee5-43db-b968-b4d11414ffdb-kube-api-access-npqzh\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.826207 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:23 crc kubenswrapper[4954]: I1206 07:19:23.826218 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a53752-0ee5-43db-b968-b4d11414ffdb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.252372 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerStarted","Data":"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.252924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerStarted","Data":"9bf9a672335a9f4cc7a899571de11fb912c8c743a935ada8059d09b3dce90040"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.256414 4954 generic.go:334] "Generic (PLEG): container finished" podID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerID="95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea" exitCode=0 Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.256671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" event={"ID":"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9","Type":"ContainerDied","Data":"95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.256731 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" event={"ID":"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9","Type":"ContainerStarted","Data":"564ba153f12b1582c4f91ef137dc6dcad6372d1130b53d84be0493314f8a12a1"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.265388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerStarted","Data":"f6db7f340df6fbda501087af9dab6a6a1d0f26bd3fdf5cd8bb0e29db47e1856b"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 
07:19:24.285763 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerStarted","Data":"fdf17bd0dc1ae4c72061ac5a70e2ecda5d81bd81ba6b0830f0afb75b6ad277b4"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.295221 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fcgjm" event={"ID":"24a53752-0ee5-43db-b968-b4d11414ffdb","Type":"ContainerDied","Data":"6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.295267 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b18385b98aa796c4502af81134f0f82d24002f658fd4832d3abc7c06e9b4fac" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.295408 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fcgjm" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.302802 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerStarted","Data":"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.302868 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerStarted","Data":"47fa2c18299344862be26695529db49df9522857e530254505cd47960b91bd42"} Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.557541 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:24 crc kubenswrapper[4954]: E1206 07:19:24.558371 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" containerName="cinder-db-sync" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.558393 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" containerName="cinder-db-sync" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.558748 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" containerName="cinder-db-sync" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.566001 4954 util.go:30] "No sandbox for pod can be found. 
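The records above show kubelet's two-step volume teardown for the finished cinder-db-sync pod (UID 24a53752-0ee5-43db-b968-b4d11414ffdb): operation_generator.go logs one UnmountVolume.TearDown per volume, and reconciler_common.go then reports each volume detached from node "crc". A minimal Python sketch for auditing that pairing from the raw journal text, assuming only the record shapes visible above; the regexes and function names are illustrative, not kubelet APIs:

import re

# TearDown lines quote the volume as <plugin>/<pod-uid>-<volume-name> in plain quotes.
TEARDOWN = re.compile(
    r'UnmountVolume\.TearDown succeeded for volume '
    r'"kubernetes\.io/[^/]+/'
    r'(?P<uid>[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})'
    r'-(?P<vol>[^"]+)"')
# "Volume detached" lines are klog-structured, so their inner quotes appear as \".
DETACHED = re.compile(r'"Volume detached for volume \\"(?P<vol>[^\\]+)\\"')

def audit_teardown(journal_lines, pod_uid):
    """Return (torn_down, detached) volume-name sets for one pod UID."""
    torn, detached = set(), set()
    for line in journal_lines:
        m = TEARDOWN.search(line)
        if m and m.group("uid") == pod_uid:
            torn.add(m.group("vol"))
        m = DETACHED.search(line)
        if m and pod_uid in line:
            detached.add(m.group("vol"))
    return torn, detached

# For 24a53752-... the TearDown records above cover db-sync-config-data,
# combined-ca-bundle and config-data, and the detach records add scripts,
# etc-machine-id and kube-api-access-npqzh; a volume that shows up in torn
# but never in detached would be one stuck in teardown.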
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.573422 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hk7x6" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.573850 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.575710 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.578187 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.580672 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661576 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbjsn\" (UniqueName: \"kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661645 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661808 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661931 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.661978 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.663803 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.692106 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:19:24 crc 
kubenswrapper[4954]: I1206 07:19:24.695642 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.758409 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764146 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764177 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbjsn\" (UniqueName: \"kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764203 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764250 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-swift-storage-0\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764354 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvgk\" (UniqueName: \"kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764390 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764420 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764470 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.764494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.772446 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.775683 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.788827 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.798133 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.826356 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.835433 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbjsn\" (UniqueName: \"kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn\") pod \"cinder-scheduler-0\" (UID: 
\"00fd9f36-a274-403e-804d-ba5389f6055f\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874384 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874470 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-swift-storage-0\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.874509 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvgk\" (UniqueName: \"kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.875951 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.877379 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.877398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:24 crc 
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.876551 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx"
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.901369 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvgk\" (UniqueName: \"kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk\") pod \"dnsmasq-dns-56d54d44c7-b7jvx\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx"
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.909364 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.919074 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.922799 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 06 07:19:24 crc kubenswrapper[4954]: I1206 07:19:24.930017 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.001109 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005312 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005868 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6cb\" (UniqueName: \"kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.005992 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.006036 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.083143 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.107919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.108760 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.109140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.109314 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.109718 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.109815 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.109876 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.110211 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.111418 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6cb\" (UniqueName: \"kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.112977 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.116606 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.121841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.130966 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.136484 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6cb\" (UniqueName: \"kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb\") pod \"cinder-api-0\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " pod="openstack/cinder-api-0"
Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.320025 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.330462 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerStarted","Data":"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22"} Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.330779 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.330834 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:25 crc kubenswrapper[4954]: I1206 07:19:25.358434 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79d7899494-4fjgd" podStartSLOduration=3.358409427 podStartE2EDuration="3.358409427s" podCreationTimestamp="2025-12-06 07:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:25.349919939 +0000 UTC m=+1340.163279338" watchObservedRunningTime="2025-12-06 07:19:25.358409427 +0000 UTC m=+1340.171768816" Dec 06 07:19:26 crc kubenswrapper[4954]: I1206 07:19:26.740414 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:26 crc kubenswrapper[4954]: W1206 07:19:26.818996 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7f82170_4cf8_4801_a741_4714d188f4bf.slice/crio-8ff8a0ed15b1439d9936b684edd900984f45982fc4a2a5aa6ca0dc867bdfd9c5 WatchSource:0}: Error finding container 8ff8a0ed15b1439d9936b684edd900984f45982fc4a2a5aa6ca0dc867bdfd9c5: Status 404 returned error can't find the container with id 8ff8a0ed15b1439d9936b684edd900984f45982fc4a2a5aa6ca0dc867bdfd9c5 Dec 06 07:19:26 crc kubenswrapper[4954]: I1206 07:19:26.879273 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:26 crc kubenswrapper[4954]: W1206 07:19:26.879887 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00fd9f36_a274_403e_804d_ba5389f6055f.slice/crio-1abe3fc4a807a6d9321c8ef3978625f4a9ce9b2909dcf6eb80d0fbc447aa35b8 WatchSource:0}: Error finding container 1abe3fc4a807a6d9321c8ef3978625f4a9ce9b2909dcf6eb80d0fbc447aa35b8: Status 404 returned error can't find the container with id 1abe3fc4a807a6d9321c8ef3978625f4a9ce9b2909dcf6eb80d0fbc447aa35b8 Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.056518 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:19:27 crc kubenswrapper[4954]: W1206 07:19:27.092962 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674dbe14_b0c4_4854_9861_c374f07568d0.slice/crio-e8d3b972dfef9ed07865f254c8b8b8923cacc1fb86a4ea88c2547f9a1f55ae48 WatchSource:0}: Error finding container e8d3b972dfef9ed07865f254c8b8b8923cacc1fb86a4ea88c2547f9a1f55ae48: Status 404 returned error can't find the container with id e8d3b972dfef9ed07865f254c8b8b8923cacc1fb86a4ea88c2547f9a1f55ae48 Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.365048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" 
event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerStarted","Data":"ce32b7671edf55a676312d4bc38e6dda973ecc26c56be59a3598a590d1eebc14"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.365129 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerStarted","Data":"632cafbc6ab8b4bca12eb7135c2589ae827ea7666ad9a40b63f7adf3a97af9c7"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.370231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" event={"ID":"674dbe14-b0c4-4854-9861-c374f07568d0","Type":"ContainerStarted","Data":"e8d3b972dfef9ed07865f254c8b8b8923cacc1fb86a4ea88c2547f9a1f55ae48"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.379712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerStarted","Data":"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.385584 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerStarted","Data":"8ff8a0ed15b1439d9936b684edd900984f45982fc4a2a5aa6ca0dc867bdfd9c5"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.398150 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5bf7788f9-vw5rh" podStartSLOduration=2.546141488 podStartE2EDuration="5.39812075s" podCreationTimestamp="2025-12-06 07:19:22 +0000 UTC" firstStartedPulling="2025-12-06 07:19:23.344824996 +0000 UTC m=+1338.158184385" lastFinishedPulling="2025-12-06 07:19:26.196804248 +0000 UTC m=+1341.010163647" observedRunningTime="2025-12-06 07:19:27.38284907 +0000 UTC m=+1342.196208479" watchObservedRunningTime="2025-12-06 07:19:27.39812075 +0000 UTC m=+1342.211480139" Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.398869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerStarted","Data":"1abe3fc4a807a6d9321c8ef3978625f4a9ce9b2909dcf6eb80d0fbc447aa35b8"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.405644 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="dnsmasq-dns" containerID="cri-o://8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2" gracePeriod=10 Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.405695 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" event={"ID":"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9","Type":"ContainerStarted","Data":"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.406239 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.415400 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerStarted","Data":"37978e335651ac4f0f09719f13e2d04c7acd8c2ac455f49f29118e1f83cda3a9"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 
07:19:27.415448 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerStarted","Data":"0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92"} Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.433056 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" podStartSLOduration=5.433035617 podStartE2EDuration="5.433035617s" podCreationTimestamp="2025-12-06 07:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:27.430742025 +0000 UTC m=+1342.244101414" watchObservedRunningTime="2025-12-06 07:19:27.433035617 +0000 UTC m=+1342.246395006" Dec 06 07:19:27 crc kubenswrapper[4954]: I1206 07:19:27.466653 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" podStartSLOduration=2.78434003 podStartE2EDuration="5.466629248s" podCreationTimestamp="2025-12-06 07:19:22 +0000 UTC" firstStartedPulling="2025-12-06 07:19:23.547886976 +0000 UTC m=+1338.361246365" lastFinishedPulling="2025-12-06 07:19:26.230176194 +0000 UTC m=+1341.043535583" observedRunningTime="2025-12-06 07:19:27.453796944 +0000 UTC m=+1342.267156333" watchObservedRunningTime="2025-12-06 07:19:27.466629248 +0000 UTC m=+1342.279988637" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.234155 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.347291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.347919 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72sr\" (UniqueName: \"kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.348007 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.348121 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.348162 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.348192 4954 reconciler_common.go:159] 
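The pod_startup_latency_tracker records above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling); pods that pulled nothing carry the Go zero time 0001-01-01 and the two durations coincide. Checking the barbican-worker record against its own timestamps; the parser is a sketch, and since Go prints nanoseconds while Python keeps microseconds, expect agreement only to rounding:

from datetime import datetime
import math

def parse_klog_time(ts):
    """Parse '2025-12-06 07:19:26.196804248 +0000 UTC', truncating ns to us."""
    date, clock, offset, _zone = ts.split()
    if "." in clock:
        clock = clock[:15]  # HH:MM:SS.ffffff keeps at most 6 fractional digits
        fmt = "%Y-%m-%d %H:%M:%S.%f %z"
    else:
        fmt = "%Y-%m-%d %H:%M:%S %z"
    return datetime.strptime(f"{date} {clock} {offset}", fmt)

created = parse_klog_time("2025-12-06 07:19:22 +0000 UTC")
pull_start = parse_klog_time("2025-12-06 07:19:23.344824996 +0000 UTC")
pull_end = parse_klog_time("2025-12-06 07:19:26.196804248 +0000 UTC")
watched = parse_klog_time("2025-12-06 07:19:27.39812075 +0000 UTC")

e2e = (watched - created).total_seconds()       # ~5.398121, record: "5.39812075s"
pull = (pull_end - pull_start).total_seconds()  # ~2.851979
slo = e2e - pull                                # ~2.546141
assert math.isclose(slo, 2.546141488, abs_tol=1e-3)  # podStartSLOduration above

The ceilometer-0 record further down fits the same relation (7.638451476 - 5.638534748 is approximately its podStartSLOduration of 1.999916718).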
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc\") pod \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\" (UID: \"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9\") " Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.430013 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr" (OuterVolumeSpecName: "kube-api-access-w72sr") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "kube-api-access-w72sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.451395 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72sr\" (UniqueName: \"kubernetes.io/projected/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-kube-api-access-w72sr\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.455479 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.457192 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.465167 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.456018 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.455878 4954 generic.go:334] "Generic (PLEG): container finished" podID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerID="8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2" exitCode=0 Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.455916 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" event={"ID":"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9","Type":"ContainerDied","Data":"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2"} Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.467062 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-5lhjf" event={"ID":"e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9","Type":"ContainerDied","Data":"564ba153f12b1582c4f91ef137dc6dcad6372d1130b53d84be0493314f8a12a1"} Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.467105 4954 scope.go:117] "RemoveContainer" containerID="8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.480822 4954 generic.go:334] "Generic (PLEG): container finished" podID="674dbe14-b0c4-4854-9861-c374f07568d0" containerID="b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a" exitCode=0 Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.480902 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" event={"ID":"674dbe14-b0c4-4854-9861-c374f07568d0","Type":"ContainerDied","Data":"b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a"} Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.498518 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config" (OuterVolumeSpecName: "config") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.506640 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerStarted","Data":"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6"} Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.507352 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" (UID: "e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.558825 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.558876 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.558889 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.558902 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.558914 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.672217 4954 scope.go:117] "RemoveContainer" containerID="95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.734542 4954 scope.go:117] "RemoveContainer" containerID="8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2" Dec 06 07:19:28 crc kubenswrapper[4954]: E1206 07:19:28.735953 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2\": container with ID starting with 8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2 not found: ID does not exist" containerID="8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.736000 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2"} err="failed to get container status \"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2\": rpc error: code = NotFound desc = could not find container \"8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2\": container with ID starting with 8bb310e22391bca9f4d47b5c725d352f4c948ac6f7f0569324091692cafa64c2 not found: ID does not exist" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.736028 4954 scope.go:117] "RemoveContainer" containerID="95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea" Dec 06 07:19:28 crc kubenswrapper[4954]: E1206 07:19:28.736653 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea\": container with ID starting with 95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea not found: ID does not exist" containerID="95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.736677 4954 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea"} err="failed to get container status \"95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea\": rpc error: code = NotFound desc = could not find container \"95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea\": container with ID starting with 95c681e815f438807c473a1c4a7914b51a55e611421042d80c310f0d7d9abcea not found: ID does not exist" Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.831724 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.852334 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-5lhjf"] Dec 06 07:19:28 crc kubenswrapper[4954]: I1206 07:19:28.915639 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.509078 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" path="/var/lib/kubelet/pods/e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9/volumes" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.551682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" event={"ID":"674dbe14-b0c4-4854-9861-c374f07568d0","Type":"ContainerStarted","Data":"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d"} Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.551793 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.585390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerStarted","Data":"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d"} Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.585598 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" podStartSLOduration=5.585551376 podStartE2EDuration="5.585551376s" podCreationTimestamp="2025-12-06 07:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:29.581875187 +0000 UTC m=+1344.395234576" watchObservedRunningTime="2025-12-06 07:19:29.585551376 +0000 UTC m=+1344.398910795" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.587083 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.596546 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerStarted","Data":"7ad3100b41056d3e892dbf2140d978b7f78fcb4a1c6bd361245de3a14d01da3a"} Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.631253 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerStarted","Data":"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b"} Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.638490 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9999167180000001 
podStartE2EDuration="7.638451476s" podCreationTimestamp="2025-12-06 07:19:22 +0000 UTC" firstStartedPulling="2025-12-06 07:19:23.263354959 +0000 UTC m=+1338.076714338" lastFinishedPulling="2025-12-06 07:19:28.901889707 +0000 UTC m=+1343.715249096" observedRunningTime="2025-12-06 07:19:29.62447315 +0000 UTC m=+1344.437832539" watchObservedRunningTime="2025-12-06 07:19:29.638451476 +0000 UTC m=+1344.451810865" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.686647 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:19:29 crc kubenswrapper[4954]: E1206 07:19:29.687212 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="dnsmasq-dns" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.687237 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="dnsmasq-dns" Dec 06 07:19:29 crc kubenswrapper[4954]: E1206 07:19:29.687271 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="init" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.687279 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="init" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.687468 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a3dc85-0860-4b9f-bf71-ffebdf1cf2a9" containerName="dnsmasq-dns" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.688643 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.695619 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.698809 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.719995 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794081 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794254 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5jfl\" (UniqueName: \"kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794317 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794364 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794393 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.794424 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.900167 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.900738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.900847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5jfl\" (UniqueName: \"kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.900938 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.901002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: 
I1206 07:19:29.901068 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.901101 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.914263 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.973794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5jfl\" (UniqueName: \"kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.977019 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.977581 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:29 crc kubenswrapper[4954]: I1206 07:19:29.984900 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.002028 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.008764 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom\") pod \"barbican-api-759b9cfd76-jp2pl\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.057080 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.665738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerStarted","Data":"2f13334e6beba2ec58e3ae8bd2591f09012968f6920bc7a8a783c05b718fa530"} Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.666309 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.665973 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api" containerID="cri-o://2f13334e6beba2ec58e3ae8bd2591f09012968f6920bc7a8a783c05b718fa530" gracePeriod=30 Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.665911 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api-log" containerID="cri-o://7ad3100b41056d3e892dbf2140d978b7f78fcb4a1c6bd361245de3a14d01da3a" gracePeriod=30 Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.685733 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerStarted","Data":"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535"} Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.702107 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.702081892 podStartE2EDuration="6.702081892s" podCreationTimestamp="2025-12-06 07:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:30.695049543 +0000 UTC m=+1345.508408932" watchObservedRunningTime="2025-12-06 07:19:30.702081892 +0000 UTC m=+1345.515441281" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.736422 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.621683486 podStartE2EDuration="6.736392633s" podCreationTimestamp="2025-12-06 07:19:24 +0000 UTC" firstStartedPulling="2025-12-06 07:19:26.938099993 +0000 UTC m=+1341.751459382" lastFinishedPulling="2025-12-06 07:19:28.05280915 +0000 UTC m=+1342.866168529" observedRunningTime="2025-12-06 07:19:30.733484485 +0000 UTC m=+1345.546843884" watchObservedRunningTime="2025-12-06 07:19:30.736392633 +0000 UTC m=+1345.549752022" Dec 06 07:19:30 crc kubenswrapper[4954]: I1206 07:19:30.771393 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.588957 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.746963 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerStarted","Data":"01b096b10176462db3110ae80702f8ad1ade5c76a6f11bd639c695de9a47e99c"} Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.748189 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:31 crc 
kubenswrapper[4954]: I1206 07:19:31.748227 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.748241 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerStarted","Data":"d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa"} Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.748258 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerStarted","Data":"086efa941c2782c799a58103f81050d1c4c077adbeba240ce660ff290dcc32e1"} Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.756343 4954 generic.go:334] "Generic (PLEG): container finished" podID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerID="2f13334e6beba2ec58e3ae8bd2591f09012968f6920bc7a8a783c05b718fa530" exitCode=0 Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.756385 4954 generic.go:334] "Generic (PLEG): container finished" podID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerID="7ad3100b41056d3e892dbf2140d978b7f78fcb4a1c6bd361245de3a14d01da3a" exitCode=143 Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.757668 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerDied","Data":"2f13334e6beba2ec58e3ae8bd2591f09012968f6920bc7a8a783c05b718fa530"} Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.757718 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerDied","Data":"7ad3100b41056d3e892dbf2140d978b7f78fcb4a1c6bd361245de3a14d01da3a"} Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.782861 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-759b9cfd76-jp2pl" podStartSLOduration=2.782831927 podStartE2EDuration="2.782831927s" podCreationTimestamp="2025-12-06 07:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:31.774185235 +0000 UTC m=+1346.587544624" watchObservedRunningTime="2025-12-06 07:19:31.782831927 +0000 UTC m=+1346.596191316" Dec 06 07:19:31 crc kubenswrapper[4954]: I1206 07:19:31.947393 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083339 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083447 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.083552 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr6cb\" (UniqueName: \"kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb\") pod \"f7f82170-4cf8-4801-a741-4714d188f4bf\" (UID: \"f7f82170-4cf8-4801-a741-4714d188f4bf\") " Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.084999 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.085123 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs" (OuterVolumeSpecName: "logs") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.091644 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.093598 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts" (OuterVolumeSpecName: "scripts") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.097810 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb" (OuterVolumeSpecName: "kube-api-access-pr6cb") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "kube-api-access-pr6cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.128239 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.149963 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data" (OuterVolumeSpecName: "config-data") pod "f7f82170-4cf8-4801-a741-4714d188f4bf" (UID: "f7f82170-4cf8-4801-a741-4714d188f4bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186657 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186712 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186727 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f82170-4cf8-4801-a741-4714d188f4bf-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186738 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186754 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7f82170-4cf8-4801-a741-4714d188f4bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186765 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7f82170-4cf8-4801-a741-4714d188f4bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.186778 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr6cb\" (UniqueName: \"kubernetes.io/projected/f7f82170-4cf8-4801-a741-4714d188f4bf-kube-api-access-pr6cb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.770749 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.775675 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7f82170-4cf8-4801-a741-4714d188f4bf","Type":"ContainerDied","Data":"8ff8a0ed15b1439d9936b684edd900984f45982fc4a2a5aa6ca0dc867bdfd9c5"} Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.775730 4954 scope.go:117] "RemoveContainer" containerID="2f13334e6beba2ec58e3ae8bd2591f09012968f6920bc7a8a783c05b718fa530" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.815413 4954 scope.go:117] "RemoveContainer" containerID="7ad3100b41056d3e892dbf2140d978b7f78fcb4a1c6bd361245de3a14d01da3a" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.821425 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.841882 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.858217 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:32 crc kubenswrapper[4954]: E1206 07:19:32.858859 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api-log" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.858885 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api-log" Dec 06 07:19:32 crc kubenswrapper[4954]: E1206 07:19:32.858913 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.858919 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.859106 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api-log" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.859134 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" containerName="cinder-api" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.860322 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.864125 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.864477 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.864593 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 07:19:32 crc kubenswrapper[4954]: I1206 07:19:32.873329 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.003180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.003271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.003331 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004038 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004091 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004115 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46m7r\" (UniqueName: \"kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004215 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004275 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.004377 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106748 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106786 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106809 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46m7r\" (UniqueName: \"kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106833 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.106873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.107100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.107153 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.107232 4954 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.107382 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.108013 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.111737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.112805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.114466 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.115404 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.121421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.127438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46m7r\" (UniqueName: \"kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.128230 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") " pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.181651 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.359709 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.466433 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f82170-4cf8-4801-a741-4714d188f4bf" path="/var/lib/kubelet/pods/f7f82170-4cf8-4801-a741-4714d188f4bf/volumes" Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.755820 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:19:33 crc kubenswrapper[4954]: W1206 07:19:33.768730 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75e5da0_fd73_485e_b53a_b5e96965bb99.slice/crio-098992b0ef35dfec1bdd6242548e49ee9144814dd96d3bf5c510386e7ce75337 WatchSource:0}: Error finding container 098992b0ef35dfec1bdd6242548e49ee9144814dd96d3bf5c510386e7ce75337: Status 404 returned error can't find the container with id 098992b0ef35dfec1bdd6242548e49ee9144814dd96d3bf5c510386e7ce75337 Dec 06 07:19:33 crc kubenswrapper[4954]: I1206 07:19:33.783867 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerStarted","Data":"098992b0ef35dfec1bdd6242548e49ee9144814dd96d3bf5c510386e7ce75337"} Dec 06 07:19:34 crc kubenswrapper[4954]: I1206 07:19:34.813343 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerStarted","Data":"2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d"} Dec 06 07:19:34 crc kubenswrapper[4954]: I1206 07:19:34.910129 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.086100 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.170776 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.171051 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="dnsmasq-dns" containerID="cri-o://6f6eabd999407ab0205f96cb4ec712282a45d1316429c1b890f020566f380231" gracePeriod=10 Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.365111 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.836508 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerStarted","Data":"6880fda36024a9648a750eeda330087a75c544818aceab1279c1dd4c9af76cd2"} Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.838317 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.849171 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" 
event={"ID":"c36304c6-bb92-409f-9700-d45ac07bca8a","Type":"ContainerDied","Data":"6f6eabd999407ab0205f96cb4ec712282a45d1316429c1b890f020566f380231"} Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.849256 4954 generic.go:334] "Generic (PLEG): container finished" podID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerID="6f6eabd999407ab0205f96cb4ec712282a45d1316429c1b890f020566f380231" exitCode=0 Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.849341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" event={"ID":"c36304c6-bb92-409f-9700-d45ac07bca8a","Type":"ContainerDied","Data":"1c209e1225c76a2a5f479efe8ee4e7a25d1729ea17e97957e9da4cc9edba3ada"} Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.849358 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c209e1225c76a2a5f479efe8ee4e7a25d1729ea17e97957e9da4cc9edba3ada" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.887678 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.887653473 podStartE2EDuration="3.887653473s" podCreationTimestamp="2025-12-06 07:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:35.8610785 +0000 UTC m=+1350.674437889" watchObservedRunningTime="2025-12-06 07:19:35.887653473 +0000 UTC m=+1350.701012862" Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.919270 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:35 crc kubenswrapper[4954]: I1206 07:19:35.922913 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010537 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010650 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010726 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4cj8\" (UniqueName: \"kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: 
\"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.010905 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb\") pod \"c36304c6-bb92-409f-9700-d45ac07bca8a\" (UID: \"c36304c6-bb92-409f-9700-d45ac07bca8a\") " Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.047842 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8" (OuterVolumeSpecName: "kube-api-access-k4cj8") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "kube-api-access-k4cj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.088302 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.114096 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.114129 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4cj8\" (UniqueName: \"kubernetes.io/projected/c36304c6-bb92-409f-9700-d45ac07bca8a-kube-api-access-k4cj8\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.121873 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config" (OuterVolumeSpecName: "config") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.130633 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.132134 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.146297 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c36304c6-bb92-409f-9700-d45ac07bca8a" (UID: "c36304c6-bb92-409f-9700-d45ac07bca8a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.216489 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.216545 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.216570 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.216579 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c36304c6-bb92-409f-9700-d45ac07bca8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.612110 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.670906 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.868668 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="cinder-scheduler" containerID="cri-o://91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" gracePeriod=30 Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.869240 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-kr42c" Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.869417 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="probe" containerID="cri-o://caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" gracePeriod=30 Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.929293 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:36 crc kubenswrapper[4954]: I1206 07:19:36.936919 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-kr42c"] Dec 06 07:19:37 crc kubenswrapper[4954]: I1206 07:19:37.370070 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:19:37 crc kubenswrapper[4954]: I1206 07:19:37.493029 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" path="/var/lib/kubelet/pods/c36304c6-bb92-409f-9700-d45ac07bca8a/volumes" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.060213 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:19:38 crc kubenswrapper[4954]: E1206 07:19:38.425235 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00fd9f36_a274_403e_804d_ba5389f6055f.slice/crio-conmon-91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00fd9f36_a274_403e_804d_ba5389f6055f.slice/crio-91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b.scope\": RecentStats: unable to find data in memory cache]" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.728435 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778514 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbjsn\" (UniqueName: \"kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778680 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778756 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778806 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.778920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom\") pod \"00fd9f36-a274-403e-804d-ba5389f6055f\" (UID: \"00fd9f36-a274-403e-804d-ba5389f6055f\") " Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.779208 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.786598 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn" (OuterVolumeSpecName: "kube-api-access-kbjsn") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "kube-api-access-kbjsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.802930 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.816117 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts" (OuterVolumeSpecName: "scripts") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.875744 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.881468 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.881540 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbjsn\" (UniqueName: \"kubernetes.io/projected/00fd9f36-a274-403e-804d-ba5389f6055f-kube-api-access-kbjsn\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.881679 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00fd9f36-a274-403e-804d-ba5389f6055f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.881696 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.881712 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890778 4954 generic.go:334] "Generic (PLEG): container finished" podID="00fd9f36-a274-403e-804d-ba5389f6055f" containerID="caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" exitCode=0 Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890833 4954 generic.go:334] "Generic (PLEG): container finished" podID="00fd9f36-a274-403e-804d-ba5389f6055f" containerID="91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" exitCode=0 Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890869 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerDied","Data":"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535"} Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890885 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890918 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerDied","Data":"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b"} Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890938 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00fd9f36-a274-403e-804d-ba5389f6055f","Type":"ContainerDied","Data":"1abe3fc4a807a6d9321c8ef3978625f4a9ce9b2909dcf6eb80d0fbc447aa35b8"} Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.890962 4954 scope.go:117] "RemoveContainer" containerID="caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.934757 4954 scope.go:117] "RemoveContainer" containerID="91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.941288 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data" (OuterVolumeSpecName: "config-data") pod "00fd9f36-a274-403e-804d-ba5389f6055f" (UID: "00fd9f36-a274-403e-804d-ba5389f6055f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.965916 4954 scope.go:117] "RemoveContainer" containerID="caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" Dec 06 07:19:38 crc kubenswrapper[4954]: E1206 07:19:38.966601 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535\": container with ID starting with caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535 not found: ID does not exist" containerID="caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.966654 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535"} err="failed to get container status \"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535\": rpc error: code = NotFound desc = could not find container \"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535\": container with ID starting with caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535 not found: ID does not exist" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.966685 4954 scope.go:117] "RemoveContainer" containerID="91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" Dec 06 07:19:38 crc kubenswrapper[4954]: E1206 07:19:38.970519 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b\": container with ID starting with 91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b not found: ID does not exist" containerID="91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.970700 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b"} err="failed to get container status \"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b\": rpc error: code = NotFound desc = could not find container \"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b\": container with ID starting with 91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b not found: ID does not exist" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.970741 4954 scope.go:117] "RemoveContainer" containerID="caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.971202 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535"} err="failed to get container status \"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535\": rpc error: code = NotFound desc = could not find container \"caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535\": container with ID starting with caa8a28e30bb3497c62f7ff3b7d5c17a1b2aaaee50a05b343156cb1ac79ad535 not found: ID does not exist" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.971220 4954 scope.go:117] "RemoveContainer" containerID="91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.971480 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b"} err="failed to get container status \"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b\": rpc error: code = NotFound desc = could not find container \"91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b\": container with ID starting with 91a025b4233559094c6ac6881c999d919241ff6092c36a07dd4ffb82f41ac12b not found: ID does not exist" Dec 06 07:19:38 crc kubenswrapper[4954]: I1206 07:19:38.990135 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fd9f36-a274-403e-804d-ba5389f6055f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.233596 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.247442 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.284822 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:39 crc kubenswrapper[4954]: E1206 07:19:39.285888 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="probe" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.285959 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="probe" Dec 06 07:19:39 crc kubenswrapper[4954]: E1206 07:19:39.286009 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="dnsmasq-dns" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286018 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="dnsmasq-dns" Dec 06 07:19:39 crc kubenswrapper[4954]: E1206 07:19:39.286063 
4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="cinder-scheduler" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286071 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="cinder-scheduler" Dec 06 07:19:39 crc kubenswrapper[4954]: E1206 07:19:39.286102 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="init" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286109 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="init" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286624 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="probe" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286668 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36304c6-bb92-409f-9700-d45ac07bca8a" containerName="dnsmasq-dns" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.286689 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" containerName="cinder-scheduler" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.288751 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.292919 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.307971 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.308322 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2n8\" (UniqueName: \"kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.308541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.308701 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.308848 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.308914 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.317599 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.410763 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2n8\" (UniqueName: \"kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.410897 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.410960 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.411015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.411047 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.411104 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.411246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.416683 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc 
kubenswrapper[4954]: I1206 07:19:39.417996 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.418963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.429237 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.435076 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2n8\" (UniqueName: \"kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8\") pod \"cinder-scheduler-0\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " pod="openstack/cinder-scheduler-0" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.459220 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fd9f36-a274-403e-804d-ba5389f6055f" path="/var/lib/kubelet/pods/00fd9f36-a274-403e-804d-ba5389f6055f/volumes" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.520027 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.605213 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.605515 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ff8d8cf8b-qx8qg" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-api" containerID="cri-o://13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22" gracePeriod=30 Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.606144 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ff8d8cf8b-qx8qg" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-httpd" containerID="cri-o://17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e" gracePeriod=30 Dec 06 07:19:39 crc kubenswrapper[4954]: I1206 07:19:39.635285 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:19:40 crc kubenswrapper[4954]: I1206 07:19:40.216304 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:19:40 crc kubenswrapper[4954]: I1206 07:19:40.939203 4954 generic.go:334] "Generic (PLEG): container finished" podID="b41a1242-45fc-4a75-850b-8b0732873987" containerID="17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e" exitCode=0 Dec 06 07:19:40 crc kubenswrapper[4954]: I1206 07:19:40.939296 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerDied","Data":"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e"} Dec 06 07:19:40 crc kubenswrapper[4954]: I1206 07:19:40.947272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerStarted","Data":"242e83ae58481e6672ec1431f57fd8aa94eb2721e13b54b3a8eb1f6bf979046a"} Dec 06 07:19:41 crc kubenswrapper[4954]: I1206 07:19:41.973487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerStarted","Data":"25aaea2454949bcf5aa44f34e823a052575eba578039116af36245405f9d00cf"} Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.316719 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.321937 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.324179 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.325178 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s44tz" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.326159 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.335166 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.438731 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.438901 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.438961 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnf9\" (UniqueName: \"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " 
pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.439016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.540919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.541399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmnf9\" (UniqueName: \"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.541454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.541540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.544990 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.548308 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.564576 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.564707 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmnf9\" (UniqueName: \"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9\") pod \"openstackclient\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.643247 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.779130 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:42 crc kubenswrapper[4954]: I1206 07:19:42.828423 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.018984 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.020149 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d7899494-4fjgd" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api" containerID="cri-o://1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22" gracePeriod=30 Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.019501 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d7899494-4fjgd" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api-log" containerID="cri-o://2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e" gracePeriod=30 Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.079725 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerStarted","Data":"dde3cc2c35c6b947b115a84354f88ab0c259ca23e0659e70ff1f7de4d80de96e"} Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.136251 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.136228892 podStartE2EDuration="4.136228892s" podCreationTimestamp="2025-12-06 07:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:43.121076925 +0000 UTC m=+1357.934436314" watchObservedRunningTime="2025-12-06 07:19:43.136228892 +0000 UTC m=+1357.949588281" Dec 06 07:19:43 crc kubenswrapper[4954]: I1206 07:19:43.385242 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 07:19:44 crc kubenswrapper[4954]: I1206 07:19:44.110288 4954 generic.go:334] "Generic (PLEG): container finished" podID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerID="2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e" exitCode=143 Dec 06 07:19:44 crc kubenswrapper[4954]: I1206 07:19:44.111157 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerDied","Data":"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e"} Dec 06 07:19:44 crc kubenswrapper[4954]: I1206 07:19:44.125636 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e","Type":"ContainerStarted","Data":"231e9b3a26a00bf8da3f377a1857afba59e57eebd4afccffa1379f2bb0de3caf"} Dec 06 07:19:44 crc kubenswrapper[4954]: I1206 07:19:44.635837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 07:19:44 crc kubenswrapper[4954]: I1206 07:19:44.949818 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.040038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle\") pod \"b41a1242-45fc-4a75-850b-8b0732873987\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.040115 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config\") pod \"b41a1242-45fc-4a75-850b-8b0732873987\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.040241 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config\") pod \"b41a1242-45fc-4a75-850b-8b0732873987\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.040288 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs\") pod \"b41a1242-45fc-4a75-850b-8b0732873987\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.040328 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9npr2\" (UniqueName: \"kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2\") pod \"b41a1242-45fc-4a75-850b-8b0732873987\" (UID: \"b41a1242-45fc-4a75-850b-8b0732873987\") " Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.050686 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2" (OuterVolumeSpecName: "kube-api-access-9npr2") pod "b41a1242-45fc-4a75-850b-8b0732873987" (UID: "b41a1242-45fc-4a75-850b-8b0732873987"). InnerVolumeSpecName "kube-api-access-9npr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.059965 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b41a1242-45fc-4a75-850b-8b0732873987" (UID: "b41a1242-45fc-4a75-850b-8b0732873987"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.140398 4954 generic.go:334] "Generic (PLEG): container finished" podID="b41a1242-45fc-4a75-850b-8b0732873987" containerID="13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22" exitCode=0 Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.141768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerDied","Data":"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22"} Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.141826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff8d8cf8b-qx8qg" event={"ID":"b41a1242-45fc-4a75-850b-8b0732873987","Type":"ContainerDied","Data":"8fb4a70b034d55c04a0f570cc7f1dcb24563adbff10cb69861b18e631a6e65e4"} Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.141826 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff8d8cf8b-qx8qg" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.141850 4954 scope.go:117] "RemoveContainer" containerID="17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.143641 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9npr2\" (UniqueName: \"kubernetes.io/projected/b41a1242-45fc-4a75-850b-8b0732873987-kube-api-access-9npr2\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.143659 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.170902 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b41a1242-45fc-4a75-850b-8b0732873987" (UID: "b41a1242-45fc-4a75-850b-8b0732873987"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.180525 4954 scope.go:117] "RemoveContainer" containerID="13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.194095 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b41a1242-45fc-4a75-850b-8b0732873987" (UID: "b41a1242-45fc-4a75-850b-8b0732873987"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.227780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config" (OuterVolumeSpecName: "config") pod "b41a1242-45fc-4a75-850b-8b0732873987" (UID: "b41a1242-45fc-4a75-850b-8b0732873987"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.233126 4954 scope.go:117] "RemoveContainer" containerID="17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e" Dec 06 07:19:45 crc kubenswrapper[4954]: E1206 07:19:45.234400 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e\": container with ID starting with 17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e not found: ID does not exist" containerID="17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.234440 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e"} err="failed to get container status \"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e\": rpc error: code = NotFound desc = could not find container \"17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e\": container with ID starting with 17734e5db296a1112832ee1fec0a82dc416a216f61ede9129566b9f19b43668e not found: ID does not exist" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.234463 4954 scope.go:117] "RemoveContainer" containerID="13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22" Dec 06 07:19:45 crc kubenswrapper[4954]: E1206 07:19:45.234852 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22\": container with ID starting with 13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22 not found: ID does not exist" containerID="13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.234884 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22"} err="failed to get container status \"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22\": rpc error: code = NotFound desc = could not find container \"13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22\": container with ID starting with 13a279add44a664cd0e5027ab334fe72d8f8e781aab77d9e967476e881a64f22 not found: ID does not exist" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.245914 4954 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.245950 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.245961 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b41a1242-45fc-4a75-850b-8b0732873987-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.498497 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.519789 4954 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ff8d8cf8b-qx8qg"] Dec 06 07:19:45 crc kubenswrapper[4954]: I1206 07:19:45.921038 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.234395 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d7899494-4fjgd" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44948->10.217.0.161:9311: read: connection reset by peer" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.235230 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d7899494-4fjgd" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44936->10.217.0.161:9311: read: connection reset by peer" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.815679 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.889500 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle\") pod \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.889598 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs\") pod \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.889800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data\") pod \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.889920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b24jd\" (UniqueName: \"kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd\") pod \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.889961 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom\") pod \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\" (UID: \"248b92ff-f8e8-40a1-beec-1d456e88d1c2\") " Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.891419 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs" (OuterVolumeSpecName: "logs") pod "248b92ff-f8e8-40a1-beec-1d456e88d1c2" (UID: "248b92ff-f8e8-40a1-beec-1d456e88d1c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.899220 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "248b92ff-f8e8-40a1-beec-1d456e88d1c2" (UID: "248b92ff-f8e8-40a1-beec-1d456e88d1c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.924905 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "248b92ff-f8e8-40a1-beec-1d456e88d1c2" (UID: "248b92ff-f8e8-40a1-beec-1d456e88d1c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.931924 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd" (OuterVolumeSpecName: "kube-api-access-b24jd") pod "248b92ff-f8e8-40a1-beec-1d456e88d1c2" (UID: "248b92ff-f8e8-40a1-beec-1d456e88d1c2"). InnerVolumeSpecName "kube-api-access-b24jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.958773 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data" (OuterVolumeSpecName: "config-data") pod "248b92ff-f8e8-40a1-beec-1d456e88d1c2" (UID: "248b92ff-f8e8-40a1-beec-1d456e88d1c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.996275 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.996337 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b24jd\" (UniqueName: \"kubernetes.io/projected/248b92ff-f8e8-40a1-beec-1d456e88d1c2-kube-api-access-b24jd\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.996356 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.996367 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b92ff-f8e8-40a1-beec-1d456e88d1c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:46 crc kubenswrapper[4954]: I1206 07:19:46.996378 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/248b92ff-f8e8-40a1-beec-1d456e88d1c2-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.184229 4954 generic.go:334] "Generic (PLEG): container finished" podID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerID="1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22" exitCode=0 Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.184285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerDied","Data":"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22"} Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.184292 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79d7899494-4fjgd" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.184328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d7899494-4fjgd" event={"ID":"248b92ff-f8e8-40a1-beec-1d456e88d1c2","Type":"ContainerDied","Data":"9bf9a672335a9f4cc7a899571de11fb912c8c743a935ada8059d09b3dce90040"} Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.184350 4954 scope.go:117] "RemoveContainer" containerID="1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.236628 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.242404 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79d7899494-4fjgd"] Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.253794 4954 scope.go:117] "RemoveContainer" containerID="2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.278182 4954 scope.go:117] "RemoveContainer" containerID="1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22" Dec 06 07:19:47 crc kubenswrapper[4954]: E1206 07:19:47.278579 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22\": container with ID starting with 1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22 not found: ID does not exist" containerID="1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.278612 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22"} err="failed to get container status \"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22\": rpc error: code = NotFound desc = could not find container \"1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22\": container with ID starting with 1207c20e070bbe0dfa107a558dbce71db7570d2952c65d92663ef3e75560ed22 not found: ID does not exist" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.278635 4954 scope.go:117] "RemoveContainer" containerID="2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e" Dec 06 07:19:47 crc kubenswrapper[4954]: E1206 07:19:47.279005 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e\": container with ID starting with 2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e not found: ID does not exist" containerID="2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.279027 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e"} err="failed to get container status \"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e\": rpc error: code = NotFound desc = could not find container \"2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e\": container with ID starting with 2ac89b762f89e75447fd0e4d6776c27b6e965739788a28fa0731c202415f691e not found: ID does not exist" Dec 06 
07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.466522 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" path="/var/lib/kubelet/pods/248b92ff-f8e8-40a1-beec-1d456e88d1c2/volumes" Dec 06 07:19:47 crc kubenswrapper[4954]: I1206 07:19:47.468714 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41a1242-45fc-4a75-850b-8b0732873987" path="/var/lib/kubelet/pods/b41a1242-45fc-4a75-850b-8b0732873987/volumes" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.882891 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.923677 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:19:49 crc kubenswrapper[4954]: E1206 07:19:49.924216 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-httpd" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924240 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-httpd" Dec 06 07:19:49 crc kubenswrapper[4954]: E1206 07:19:49.924263 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-api" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924270 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-api" Dec 06 07:19:49 crc kubenswrapper[4954]: E1206 07:19:49.924288 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924295 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api" Dec 06 07:19:49 crc kubenswrapper[4954]: E1206 07:19:49.924314 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api-log" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924321 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api-log" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924544 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924583 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-api" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924599 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41a1242-45fc-4a75-850b-8b0732873987" containerName="neutron-httpd" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.924609 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="248b92ff-f8e8-40a1-beec-1d456e88d1c2" containerName="barbican-api-log" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.926262 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.929490 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.932462 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.932941 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 06 07:19:49 crc kubenswrapper[4954]: I1206 07:19:49.959422 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118157 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118236 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118309 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118423 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw8mq\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118824 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " 
pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.118860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.220970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221067 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221132 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221352 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw8mq\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221449 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.221468 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc 
kubenswrapper[4954]: I1206 07:19:50.221686 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.222642 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.229620 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.230096 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.230511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.231048 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.233437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.244598 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw8mq\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq\") pod \"swift-proxy-74985f58f-cpdl4\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") " pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:50 crc kubenswrapper[4954]: I1206 07:19:50.253970 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:52 crc kubenswrapper[4954]: I1206 07:19:52.605143 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 07:19:53 crc kubenswrapper[4954]: I1206 07:19:53.192043 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:53 crc kubenswrapper[4954]: I1206 07:19:53.253908 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="proxy-httpd" containerID="cri-o://8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d" gracePeriod=30 Dec 06 07:19:53 crc kubenswrapper[4954]: I1206 07:19:53.253925 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-notification-agent" containerID="cri-o://bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6" gracePeriod=30 Dec 06 07:19:53 crc kubenswrapper[4954]: I1206 07:19:53.253934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="sg-core" containerID="cri-o://d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6" gracePeriod=30 Dec 06 07:19:53 crc kubenswrapper[4954]: I1206 07:19:53.254252 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-central-agent" containerID="cri-o://8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e" gracePeriod=30 Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279159 4954 generic.go:334] "Generic (PLEG): container finished" podID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerID="8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d" exitCode=0 Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279522 4954 generic.go:334] "Generic (PLEG): container finished" podID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerID="d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6" exitCode=2 Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279534 4954 generic.go:334] "Generic (PLEG): container finished" podID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerID="8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e" exitCode=0 Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279196 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerDied","Data":"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d"} Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerDied","Data":"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6"} Dec 06 07:19:54 crc kubenswrapper[4954]: I1206 07:19:54.279627 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerDied","Data":"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e"} Dec 06 07:19:55 crc kubenswrapper[4954]: I1206 07:19:55.590618 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Dec 06 07:19:55 crc kubenswrapper[4954]: I1206 07:19:55.590936 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3d6a09b0-a359-48db-9682-88114d28e3d9" containerName="kube-state-metrics" containerID="cri-o://c79369a269d20fe6e8e33c11b23244d21c6b3f4b11f43329f758a9d72ccb557c" gracePeriod=30 Dec 06 07:19:56 crc kubenswrapper[4954]: I1206 07:19:56.305019 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d6a09b0-a359-48db-9682-88114d28e3d9" containerID="c79369a269d20fe6e8e33c11b23244d21c6b3f4b11f43329f758a9d72ccb557c" exitCode=2 Dec 06 07:19:56 crc kubenswrapper[4954]: I1206 07:19:56.305107 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d6a09b0-a359-48db-9682-88114d28e3d9","Type":"ContainerDied","Data":"c79369a269d20fe6e8e33c11b23244d21c6b3f4b11f43329f758a9d72ccb557c"} Dec 06 07:19:57 crc kubenswrapper[4954]: I1206 07:19:57.741274 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:19:57 crc kubenswrapper[4954]: I1206 07:19:57.845864 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8th6\" (UniqueName: \"kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6\") pod \"3d6a09b0-a359-48db-9682-88114d28e3d9\" (UID: \"3d6a09b0-a359-48db-9682-88114d28e3d9\") " Dec 06 07:19:57 crc kubenswrapper[4954]: I1206 07:19:57.858080 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6" (OuterVolumeSpecName: "kube-api-access-r8th6") pod "3d6a09b0-a359-48db-9682-88114d28e3d9" (UID: "3d6a09b0-a359-48db-9682-88114d28e3d9"). InnerVolumeSpecName "kube-api-access-r8th6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:57 crc kubenswrapper[4954]: I1206 07:19:57.948205 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:57 crc kubenswrapper[4954]: I1206 07:19:57.958811 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8th6\" (UniqueName: \"kubernetes.io/projected/3d6a09b0-a359-48db-9682-88114d28e3d9-kube-api-access-r8th6\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.061283 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.062783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.062937 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.063068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.063127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.063179 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.063625 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64474\" (UniqueName: \"kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474\") pod \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\" (UID: \"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1\") " Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.065393 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.070222 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474" (OuterVolumeSpecName: "kube-api-access-64474") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "kube-api-access-64474". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.074201 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.074777 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts" (OuterVolumeSpecName: "scripts") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.116122 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.137371 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.166326 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.166365 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64474\" (UniqueName: \"kubernetes.io/projected/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-kube-api-access-64474\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.166375 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.166386 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.166394 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.209968 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.268874 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.309793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data" (OuterVolumeSpecName: "config-data") pod "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" (UID: "67ac7d91-1d01-4997-8d95-2ca7e74c2ae1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.342571 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerStarted","Data":"51421937a8141e0b37995b5aad66e605467b9e0ff73182c1165ee1cc868635ca"} Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.345382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d6a09b0-a359-48db-9682-88114d28e3d9","Type":"ContainerDied","Data":"ef343d324936bc75f5611b7e81b73e2669d36961efef4ebb7a3a60185758d0de"} Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.345450 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.345460 4954 scope.go:117] "RemoveContainer" containerID="c79369a269d20fe6e8e33c11b23244d21c6b3f4b11f43329f758a9d72ccb557c" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.354704 4954 generic.go:334] "Generic (PLEG): container finished" podID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerID="bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6" exitCode=0 Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.354775 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerDied","Data":"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6"} Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.354849 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67ac7d91-1d01-4997-8d95-2ca7e74c2ae1","Type":"ContainerDied","Data":"47fa2c18299344862be26695529db49df9522857e530254505cd47960b91bd42"} Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.355534 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.367229 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e","Type":"ContainerStarted","Data":"084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64"} Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.373324 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.391535 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.578172902 podStartE2EDuration="16.391513587s" podCreationTimestamp="2025-12-06 07:19:42 +0000 UTC" firstStartedPulling="2025-12-06 07:19:43.398277875 +0000 UTC m=+1358.211637264" lastFinishedPulling="2025-12-06 07:19:57.21161856 +0000 UTC m=+1372.024977949" observedRunningTime="2025-12-06 07:19:58.389297148 +0000 UTC m=+1373.202656537" watchObservedRunningTime="2025-12-06 07:19:58.391513587 +0000 UTC m=+1373.204872976" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.407671 4954 scope.go:117] "RemoveContainer" containerID="8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.428646 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.441134 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.447713 4954 scope.go:117] "RemoveContainer" containerID="d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.466165 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.494242 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.497994 4954 scope.go:117] "RemoveContainer" containerID="bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.509703 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.510351 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6a09b0-a359-48db-9682-88114d28e3d9" containerName="kube-state-metrics" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510380 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6a09b0-a359-48db-9682-88114d28e3d9" containerName="kube-state-metrics" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.510406 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-notification-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510418 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-notification-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.510454 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="proxy-httpd" Dec 06 
07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510478 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="proxy-httpd" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.510495 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-central-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510503 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-central-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.510524 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="sg-core" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510531 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="sg-core" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510911 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="sg-core" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510940 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6a09b0-a359-48db-9682-88114d28e3d9" containerName="kube-state-metrics" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="proxy-httpd" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510964 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-notification-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.510990 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" containerName="ceilometer-central-agent" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.516002 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.521750 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-6dh8g" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.522004 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.522157 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.546525 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.548482 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.550392 4954 scope.go:117] "RemoveContainer" containerID="8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.553493 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.554384 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.569890 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578634 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pczzb\" (UniqueName: \"kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578694 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578931 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtc9\" (UniqueName: \"kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.578977 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.579010 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.579141 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.579205 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.579232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.599277 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.604648 4954 scope.go:117] "RemoveContainer" containerID="8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.615954 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d\": container with ID starting with 8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d not found: ID does not exist" containerID="8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.616019 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d"} err="failed to get container status \"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d\": rpc error: code = NotFound desc = could not find container \"8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d\": container with ID starting with 8f740bed86f364396934d489e5587539e42c6d9128cd6017eda216b8f5cb484d not found: ID does not exist" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.616051 4954 scope.go:117] "RemoveContainer" containerID="d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.622640 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6\": container with ID starting with d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6 not found: ID does not exist" containerID="d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.622669 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6"} err="failed to get container status \"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6\": rpc error: code = NotFound desc = could not find container \"d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6\": container with ID starting with d819d54650536b1bfd14f1cfe21926f4e986c9310263dd229be99b1809a662a6 not found: ID does not exist" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.622693 4954 scope.go:117] "RemoveContainer" containerID="bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.623063 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6\": container with ID starting with bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6 not found: ID does not exist" containerID="bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.623085 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6"} err="failed to get container status \"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6\": rpc error: code = NotFound desc = could not find container \"bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6\": container with ID starting with bb382319fd4cfb39429bf60db7013e969ccd10993d2d5b1a011f5d407a6ea5a6 not found: ID does not exist" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.623100 4954 scope.go:117] "RemoveContainer" containerID="8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e" Dec 06 07:19:58 crc kubenswrapper[4954]: E1206 07:19:58.623308 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e\": container with ID starting with 8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e not found: ID does not exist" containerID="8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.623338 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e"} err="failed to get container status \"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e\": rpc error: code = NotFound desc = could not find container \"8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e\": container with ID starting with 8da004a7208cdb90780b4b06432c526d53dea4cd840585281269b19782f9fd6e not found: ID does not exist" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.681775 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682167 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682327 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682496 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682637 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682767 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pczzb\" (UniqueName: \"kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.682878 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.683046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.683164 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtc9\" (UniqueName: \"kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.683272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.683406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 
07:19:58.684096 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.684110 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.686922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.690270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.690687 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.693403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.695066 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.702671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.708133 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.708889 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pczzb\" (UniqueName: \"kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb\") pod \"ceilometer-0\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " pod="openstack/ceilometer-0" Dec 06 07:19:58 crc 
kubenswrapper[4954]: I1206 07:19:58.710114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtc9\" (UniqueName: \"kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9\") pod \"kube-state-metrics-0\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " pod="openstack/kube-state-metrics-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.829359 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.830451 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:19:58 crc kubenswrapper[4954]: I1206 07:19:58.886237 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.392019 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerStarted","Data":"b55c4714520851c4ac11e12722e037b67a35d180952a2ddf1903aaa5172b8c72"} Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.392421 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerStarted","Data":"cb83e86e00e38d45967cf91f65a342bea85bf6723ea7014271922e9077ba5d0d"} Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.392442 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.392457 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.444135 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.450140 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74985f58f-cpdl4" podStartSLOduration=10.450124339 podStartE2EDuration="10.450124339s" podCreationTimestamp="2025-12-06 07:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:19:59.412443917 +0000 UTC m=+1374.225803316" watchObservedRunningTime="2025-12-06 07:19:59.450124339 +0000 UTC m=+1374.263483728" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.456717 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6a09b0-a359-48db-9682-88114d28e3d9" path="/var/lib/kubelet/pods/3d6a09b0-a359-48db-9682-88114d28e3d9/volumes" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.457657 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ac7d91-1d01-4997-8d95-2ca7e74c2ae1" path="/var/lib/kubelet/pods/67ac7d91-1d01-4997-8d95-2ca7e74c2ae1/volumes" Dec 06 07:19:59 crc kubenswrapper[4954]: I1206 07:19:59.543350 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:20:00 crc kubenswrapper[4954]: I1206 07:20:00.409152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"938d0b4f-21d2-4972-8436-eb1fbd6db5bc","Type":"ContainerStarted","Data":"ff4a1927bb7fd5e4e0bc2af413f92d09c51402a69dd5aa94a639ab5dbab627ea"} Dec 06 07:20:00 crc kubenswrapper[4954]: 
I1206 07:20:00.409582 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"938d0b4f-21d2-4972-8436-eb1fbd6db5bc","Type":"ContainerStarted","Data":"766cff3da05ce40fb43cec977ccc48f30c62227c4141bf538b1adef92021baef"} Dec 06 07:20:00 crc kubenswrapper[4954]: I1206 07:20:00.411607 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerStarted","Data":"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50"} Dec 06 07:20:00 crc kubenswrapper[4954]: I1206 07:20:00.411667 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 07:20:00 crc kubenswrapper[4954]: I1206 07:20:00.411682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerStarted","Data":"544a485423def65bacc3a329c625afb737d8e2fd72f852f6cab14c4629517c1d"} Dec 06 07:20:00 crc kubenswrapper[4954]: I1206 07:20:00.435146 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.026075756 podStartE2EDuration="2.435121714s" podCreationTimestamp="2025-12-06 07:19:58 +0000 UTC" firstStartedPulling="2025-12-06 07:19:59.54146838 +0000 UTC m=+1374.354827769" lastFinishedPulling="2025-12-06 07:19:59.950514338 +0000 UTC m=+1374.763873727" observedRunningTime="2025-12-06 07:20:00.432813122 +0000 UTC m=+1375.246172511" watchObservedRunningTime="2025-12-06 07:20:00.435121714 +0000 UTC m=+1375.248481113" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.430547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerStarted","Data":"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d"} Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.599157 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5fm8l"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.601202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.629881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5fm8l"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.697603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n294l\" (UniqueName: \"kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.697783 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.733322 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gqjsq"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.737806 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.780627 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gqjsq"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.796008 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9e6e-account-create-update-fp6hk"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.798181 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.800924 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.801133 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n294l\" (UniqueName: \"kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.801427 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr7bz\" (UniqueName: \"kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.801527 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.802700 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.807468 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.859392 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n294l\" (UniqueName: \"kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l\") pod \"nova-api-db-create-5fm8l\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.902474 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9e6e-account-create-update-fp6hk"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.911148 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr7bz\" (UniqueName: 
\"kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.911235 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.913306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575vz\" (UniqueName: \"kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.913755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.914948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.965276 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr7bz\" (UniqueName: \"kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz\") pod \"nova-cell0-db-create-gqjsq\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.995288 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4dbc4"] Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.997974 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:01 crc kubenswrapper[4954]: I1206 07:20:01.999071 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.018980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575vz\" (UniqueName: \"kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.020251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.022537 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.033506 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4dbc4"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.053510 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f048-account-create-update-88gtz"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.056480 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.057960 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575vz\" (UniqueName: \"kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz\") pod \"nova-api-9e6e-account-create-update-fp6hk\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.062397 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.067467 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f048-account-create-update-88gtz"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.110184 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.128038 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq45d\" (UniqueName: \"kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.128085 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.128146 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.128164 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8fs\" (UniqueName: \"kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.190329 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.192390 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7ee2-account-create-update-z9pcl"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.198600 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.201332 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.213828 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7ee2-account-create-update-z9pcl"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.235923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.236084 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq45d\" (UniqueName: \"kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.236115 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.236152 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krrp7\" (UniqueName: \"kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.236189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.236207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8fs\" (UniqueName: \"kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.237864 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.238656 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.267657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr8fs\" (UniqueName: \"kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs\") pod \"nova-cell1-db-create-4dbc4\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.286196 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq45d\" (UniqueName: \"kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d\") pod \"nova-cell0-f048-account-create-update-88gtz\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.335926 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.338474 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krrp7\" (UniqueName: \"kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.338602 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.347612 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.387347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krrp7\" (UniqueName: \"kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7\") pod \"nova-cell1-7ee2-account-create-update-z9pcl\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.391910 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.472845 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5fm8l"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.522552 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerStarted","Data":"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c"} Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.549125 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.602305 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gqjsq"] Dec 06 07:20:02 crc kubenswrapper[4954]: I1206 07:20:02.943075 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9e6e-account-create-update-fp6hk"] Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.135949 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f048-account-create-update-88gtz"] Dec 06 07:20:03 crc kubenswrapper[4954]: W1206 07:20:03.153918 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda823a756_929e_41ba_a100_0fb69d74b7b8.slice/crio-edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38 WatchSource:0}: Error finding container edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38: Status 404 returned error can't find the container with id edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.160776 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4dbc4"] Dec 06 07:20:03 crc kubenswrapper[4954]: W1206 07:20:03.168526 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23dd7041_53d4_4d98_bfc9_fc64828c6c7f.slice/crio-87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba WatchSource:0}: Error finding container 87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba: Status 404 returned error can't find the container with id 87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.330182 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7ee2-account-create-update-z9pcl"] Dec 06 07:20:03 crc kubenswrapper[4954]: W1206 07:20:03.352707 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod048d9a25_3088_4426_914f_5c4436b1e98a.slice/crio-641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e WatchSource:0}: Error finding container 641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e: Status 404 returned error can't find the container with id 641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.534680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" event={"ID":"8234c284-a6cb-4bf4-b63a-534a4ee085aa","Type":"ContainerStarted","Data":"d13d80f84977f1d7303d6793ff01f19f9db19816a45b876992777f3cab048b0d"} 
Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.535094 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" event={"ID":"8234c284-a6cb-4bf4-b63a-534a4ee085aa","Type":"ContainerStarted","Data":"a58f19579c7854e20114d48068fb5f31747d50760174b99fa8bc737cdcb7faca"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.559537 4954 generic.go:334] "Generic (PLEG): container finished" podID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerID="262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0" exitCode=1 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.559704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerDied","Data":"262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.559980 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-central-agent" containerID="cri-o://faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50" gracePeriod=30 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.560004 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="sg-core" containerID="cri-o://7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c" gracePeriod=30 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.560113 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-notification-agent" containerID="cri-o://d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d" gracePeriod=30 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.571025 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" podStartSLOduration=2.570996046 podStartE2EDuration="2.570996046s" podCreationTimestamp="2025-12-06 07:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:03.564478491 +0000 UTC m=+1378.377837900" watchObservedRunningTime="2025-12-06 07:20:03.570996046 +0000 UTC m=+1378.384355435" Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.574776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f048-account-create-update-88gtz" event={"ID":"a823a756-929e-41ba-a100-0fb69d74b7b8","Type":"ContainerStarted","Data":"edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.582489 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" event={"ID":"048d9a25-3088-4426-914f-5c4436b1e98a","Type":"ContainerStarted","Data":"641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.585817 4954 generic.go:334] "Generic (PLEG): container finished" podID="81e2968b-ef32-49ec-81e1-e3f07c3b73b8" containerID="69544f16f8d5ca7d182a797a8081f7506395ae0820d8b9e4721bb6f746132efe" exitCode=0 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.585905 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-5fm8l" event={"ID":"81e2968b-ef32-49ec-81e1-e3f07c3b73b8","Type":"ContainerDied","Data":"69544f16f8d5ca7d182a797a8081f7506395ae0820d8b9e4721bb6f746132efe"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.585937 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5fm8l" event={"ID":"81e2968b-ef32-49ec-81e1-e3f07c3b73b8","Type":"ContainerStarted","Data":"a165ddba110c8b4c6cae9394358fc5ac57a760eb5b7f1f362ca3713e0ee13c8a"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.597886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dbc4" event={"ID":"23dd7041-53d4-4d98-bfc9-fc64828c6c7f","Type":"ContainerStarted","Data":"87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.604316 4954 generic.go:334] "Generic (PLEG): container finished" podID="3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" containerID="3515d930ad170dbba2ebd7d24e7efa55655ce22df58ee2066ea9b387286c8337" exitCode=0 Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.604450 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gqjsq" event={"ID":"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc","Type":"ContainerDied","Data":"3515d930ad170dbba2ebd7d24e7efa55655ce22df58ee2066ea9b387286c8337"} Dec 06 07:20:03 crc kubenswrapper[4954]: I1206 07:20:03.604486 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gqjsq" event={"ID":"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc","Type":"ContainerStarted","Data":"7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.632841 4954 generic.go:334] "Generic (PLEG): container finished" podID="a823a756-929e-41ba-a100-0fb69d74b7b8" containerID="81f111a5ef62bb2c7c53d724ac3b5a5c348a8db6b2c997864f73ca9b9165eb0a" exitCode=0 Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.632958 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f048-account-create-update-88gtz" event={"ID":"a823a756-929e-41ba-a100-0fb69d74b7b8","Type":"ContainerDied","Data":"81f111a5ef62bb2c7c53d724ac3b5a5c348a8db6b2c997864f73ca9b9165eb0a"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.636497 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" event={"ID":"048d9a25-3088-4426-914f-5c4436b1e98a","Type":"ContainerStarted","Data":"5bbcaa0af76358e95f0c085a663e99d6d9414aa7753d1930befd841c33827a83"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.640541 4954 generic.go:334] "Generic (PLEG): container finished" podID="23dd7041-53d4-4d98-bfc9-fc64828c6c7f" containerID="feb559d91bf9370a973f29e0a2791a8eef9c04f353959aa041a0f510e44856c2" exitCode=0 Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.640650 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dbc4" event={"ID":"23dd7041-53d4-4d98-bfc9-fc64828c6c7f","Type":"ContainerDied","Data":"feb559d91bf9370a973f29e0a2791a8eef9c04f353959aa041a0f510e44856c2"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.643798 4954 generic.go:334] "Generic (PLEG): container finished" podID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerID="7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c" exitCode=2 Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.643839 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerID="d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d" exitCode=0 Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.643874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerDied","Data":"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.643947 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerDied","Data":"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.649737 4954 generic.go:334] "Generic (PLEG): container finished" podID="8234c284-a6cb-4bf4-b63a-534a4ee085aa" containerID="d13d80f84977f1d7303d6793ff01f19f9db19816a45b876992777f3cab048b0d" exitCode=0 Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.650981 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" event={"ID":"8234c284-a6cb-4bf4-b63a-534a4ee085aa","Type":"ContainerDied","Data":"d13d80f84977f1d7303d6793ff01f19f9db19816a45b876992777f3cab048b0d"} Dec 06 07:20:04 crc kubenswrapper[4954]: I1206 07:20:04.678616 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" podStartSLOduration=2.678587261 podStartE2EDuration="2.678587261s" podCreationTimestamp="2025-12-06 07:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:04.675649882 +0000 UTC m=+1379.489009271" watchObservedRunningTime="2025-12-06 07:20:04.678587261 +0000 UTC m=+1379.491946650" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.111049 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.117447 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.156709 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts\") pod \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.156791 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts\") pod \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.156937 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr7bz\" (UniqueName: \"kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz\") pod \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\" (UID: \"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc\") " Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.157037 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n294l\" (UniqueName: \"kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l\") pod \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\" (UID: \"81e2968b-ef32-49ec-81e1-e3f07c3b73b8\") " Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.157799 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" (UID: "3dfefc17-9a79-4857-b32d-d1b2f7ba15dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.157883 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81e2968b-ef32-49ec-81e1-e3f07c3b73b8" (UID: "81e2968b-ef32-49ec-81e1-e3f07c3b73b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.165493 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz" (OuterVolumeSpecName: "kube-api-access-lr7bz") pod "3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" (UID: "3dfefc17-9a79-4857-b32d-d1b2f7ba15dc"). InnerVolumeSpecName "kube-api-access-lr7bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.168703 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l" (OuterVolumeSpecName: "kube-api-access-n294l") pod "81e2968b-ef32-49ec-81e1-e3f07c3b73b8" (UID: "81e2968b-ef32-49ec-81e1-e3f07c3b73b8"). InnerVolumeSpecName "kube-api-access-n294l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.259499 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.259544 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.259580 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr7bz\" (UniqueName: \"kubernetes.io/projected/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc-kube-api-access-lr7bz\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.259597 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n294l\" (UniqueName: \"kubernetes.io/projected/81e2968b-ef32-49ec-81e1-e3f07c3b73b8-kube-api-access-n294l\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.270430 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.280347 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.661011 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5fm8l" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.661021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5fm8l" event={"ID":"81e2968b-ef32-49ec-81e1-e3f07c3b73b8","Type":"ContainerDied","Data":"a165ddba110c8b4c6cae9394358fc5ac57a760eb5b7f1f362ca3713e0ee13c8a"} Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.661083 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a165ddba110c8b4c6cae9394358fc5ac57a760eb5b7f1f362ca3713e0ee13c8a" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.671212 4954 generic.go:334] "Generic (PLEG): container finished" podID="048d9a25-3088-4426-914f-5c4436b1e98a" containerID="5bbcaa0af76358e95f0c085a663e99d6d9414aa7753d1930befd841c33827a83" exitCode=0 Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.671300 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" event={"ID":"048d9a25-3088-4426-914f-5c4436b1e98a","Type":"ContainerDied","Data":"5bbcaa0af76358e95f0c085a663e99d6d9414aa7753d1930befd841c33827a83"} Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.674841 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gqjsq" event={"ID":"3dfefc17-9a79-4857-b32d-d1b2f7ba15dc","Type":"ContainerDied","Data":"7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a"} Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.674908 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gqjsq" Dec 06 07:20:05 crc kubenswrapper[4954]: I1206 07:20:05.674932 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.038813 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.089843 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575vz\" (UniqueName: \"kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz\") pod \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.089949 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts\") pod \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\" (UID: \"8234c284-a6cb-4bf4-b63a-534a4ee085aa\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.093085 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8234c284-a6cb-4bf4-b63a-534a4ee085aa" (UID: "8234c284-a6cb-4bf4-b63a-534a4ee085aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.100237 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz" (OuterVolumeSpecName: "kube-api-access-575vz") pod "8234c284-a6cb-4bf4-b63a-534a4ee085aa" (UID: "8234c284-a6cb-4bf4-b63a-534a4ee085aa"). InnerVolumeSpecName "kube-api-access-575vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.192792 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575vz\" (UniqueName: \"kubernetes.io/projected/8234c284-a6cb-4bf4-b63a-534a4ee085aa-kube-api-access-575vz\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.192829 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8234c284-a6cb-4bf4-b63a-534a4ee085aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.294995 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.304169 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.397838 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq45d\" (UniqueName: \"kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d\") pod \"a823a756-929e-41ba-a100-0fb69d74b7b8\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.397985 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts\") pod \"a823a756-929e-41ba-a100-0fb69d74b7b8\" (UID: \"a823a756-929e-41ba-a100-0fb69d74b7b8\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.398114 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr8fs\" (UniqueName: \"kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs\") pod \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.398311 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts\") pod \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\" (UID: \"23dd7041-53d4-4d98-bfc9-fc64828c6c7f\") " Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.399368 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23dd7041-53d4-4d98-bfc9-fc64828c6c7f" (UID: "23dd7041-53d4-4d98-bfc9-fc64828c6c7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.403333 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d" (OuterVolumeSpecName: "kube-api-access-vq45d") pod "a823a756-929e-41ba-a100-0fb69d74b7b8" (UID: "a823a756-929e-41ba-a100-0fb69d74b7b8"). InnerVolumeSpecName "kube-api-access-vq45d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.403684 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a823a756-929e-41ba-a100-0fb69d74b7b8" (UID: "a823a756-929e-41ba-a100-0fb69d74b7b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.406913 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs" (OuterVolumeSpecName: "kube-api-access-wr8fs") pod "23dd7041-53d4-4d98-bfc9-fc64828c6c7f" (UID: "23dd7041-53d4-4d98-bfc9-fc64828c6c7f"). InnerVolumeSpecName "kube-api-access-wr8fs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.500475 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr8fs\" (UniqueName: \"kubernetes.io/projected/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-kube-api-access-wr8fs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.500515 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23dd7041-53d4-4d98-bfc9-fc64828c6c7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.500525 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq45d\" (UniqueName: \"kubernetes.io/projected/a823a756-929e-41ba-a100-0fb69d74b7b8-kube-api-access-vq45d\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.500535 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a823a756-929e-41ba-a100-0fb69d74b7b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.696658 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4dbc4" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.696658 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dbc4" event={"ID":"23dd7041-53d4-4d98-bfc9-fc64828c6c7f","Type":"ContainerDied","Data":"87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba"} Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.696801 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d1cc0536b6ffe185efa35315a8bc5902fa5c62c603a0aeee2cbe8e229b92ba" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.698018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" event={"ID":"8234c284-a6cb-4bf4-b63a-534a4ee085aa","Type":"ContainerDied","Data":"a58f19579c7854e20114d48068fb5f31747d50760174b99fa8bc737cdcb7faca"} Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.698039 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58f19579c7854e20114d48068fb5f31747d50760174b99fa8bc737cdcb7faca" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.698121 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9e6e-account-create-update-fp6hk" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.706207 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f048-account-create-update-88gtz" Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.707496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f048-account-create-update-88gtz" event={"ID":"a823a756-929e-41ba-a100-0fb69d74b7b8","Type":"ContainerDied","Data":"edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38"} Dec 06 07:20:06 crc kubenswrapper[4954]: I1206 07:20:06.707579 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edf95f5596eb8dafce409b0401680fa7d88421edcb561af3f192291399dd7c38" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.093695 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.216345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krrp7\" (UniqueName: \"kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7\") pod \"048d9a25-3088-4426-914f-5c4436b1e98a\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.216979 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts\") pod \"048d9a25-3088-4426-914f-5c4436b1e98a\" (UID: \"048d9a25-3088-4426-914f-5c4436b1e98a\") " Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.218259 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "048d9a25-3088-4426-914f-5c4436b1e98a" (UID: "048d9a25-3088-4426-914f-5c4436b1e98a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.230011 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7" (OuterVolumeSpecName: "kube-api-access-krrp7") pod "048d9a25-3088-4426-914f-5c4436b1e98a" (UID: "048d9a25-3088-4426-914f-5c4436b1e98a"). InnerVolumeSpecName "kube-api-access-krrp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.319752 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krrp7\" (UniqueName: \"kubernetes.io/projected/048d9a25-3088-4426-914f-5c4436b1e98a-kube-api-access-krrp7\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.319792 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/048d9a25-3088-4426-914f-5c4436b1e98a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.720899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" event={"ID":"048d9a25-3088-4426-914f-5c4436b1e98a","Type":"ContainerDied","Data":"641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e"} Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.720957 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641d324bcb6118c001d11ed3d1eaf2064643e74aae82cf51cdd343c20291303e" Dec 06 07:20:07 crc kubenswrapper[4954]: I1206 07:20:07.721043 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7ee2-account-create-update-z9pcl" Dec 06 07:20:08 crc kubenswrapper[4954]: I1206 07:20:08.897536 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 07:20:09 crc kubenswrapper[4954]: E1206 07:20:09.403114 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache]" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.260475 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lzjzc"] Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261661 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8234c284-a6cb-4bf4-b63a-534a4ee085aa" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261682 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8234c284-a6cb-4bf4-b63a-534a4ee085aa" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261700 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e2968b-ef32-49ec-81e1-e3f07c3b73b8" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261707 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e2968b-ef32-49ec-81e1-e3f07c3b73b8" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261726 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23dd7041-53d4-4d98-bfc9-fc64828c6c7f" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261744 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="23dd7041-53d4-4d98-bfc9-fc64828c6c7f" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261754 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a823a756-929e-41ba-a100-0fb69d74b7b8" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261759 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a823a756-929e-41ba-a100-0fb69d74b7b8" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261771 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048d9a25-3088-4426-914f-5c4436b1e98a" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261777 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="048d9a25-3088-4426-914f-5c4436b1e98a" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: E1206 07:20:12.261788 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.261793 4954 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262049 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8234c284-a6cb-4bf4-b63a-534a4ee085aa" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262094 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="23dd7041-53d4-4d98-bfc9-fc64828c6c7f" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262110 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e2968b-ef32-49ec-81e1-e3f07c3b73b8" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262123 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" containerName="mariadb-database-create" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262133 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a823a756-929e-41ba-a100-0fb69d74b7b8" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.262144 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="048d9a25-3088-4426-914f-5c4436b1e98a" containerName="mariadb-account-create-update" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.263104 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.265707 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.265864 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.266764 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lnz9x" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.278878 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lzjzc"] Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.330923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.331031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9k6\" (UniqueName: \"kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.331110 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " 
pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.331159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.434722 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9k6\" (UniqueName: \"kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.435103 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.435254 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.435460 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.443297 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.443393 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.448073 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.463260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9k6\" (UniqueName: \"kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6\") pod \"nova-cell0-conductor-db-sync-lzjzc\" (UID: 
\"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.602790 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:12 crc kubenswrapper[4954]: I1206 07:20:12.994448 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lzjzc"] Dec 06 07:20:13 crc kubenswrapper[4954]: I1206 07:20:13.854629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" event={"ID":"bfda18f4-e714-4a51-a6fe-9d52cef7605b","Type":"ContainerStarted","Data":"c7f64cf275f1f70d2386c51361d049ef155cbe78ae92e936f128234ca67be2ab"} Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.837440 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.893557 4954 generic.go:334] "Generic (PLEG): container finished" podID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerID="faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50" exitCode=0 Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.893788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerDied","Data":"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50"} Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.893824 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ff9e434-1b93-4cf9-b51b-b612da62e596","Type":"ContainerDied","Data":"544a485423def65bacc3a329c625afb737d8e2fd72f852f6cab14c4629517c1d"} Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.893843 4954 scope.go:117] "RemoveContainer" containerID="262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.894048 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.951086 4954 scope.go:117] "RemoveContainer" containerID="7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.967042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968476 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968579 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pczzb\" (UniqueName: \"kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968625 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968683 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.968744 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd\") pod \"1ff9e434-1b93-4cf9-b51b-b612da62e596\" (UID: \"1ff9e434-1b93-4cf9-b51b-b612da62e596\") " Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.969644 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.969795 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.969951 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.976674 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts" (OuterVolumeSpecName: "scripts") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.976683 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb" (OuterVolumeSpecName: "kube-api-access-pczzb") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "kube-api-access-pczzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:15 crc kubenswrapper[4954]: I1206 07:20:15.987948 4954 scope.go:117] "RemoveContainer" containerID="d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.022692 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.071837 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pczzb\" (UniqueName: \"kubernetes.io/projected/1ff9e434-1b93-4cf9-b51b-b612da62e596-kube-api-access-pczzb\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.072076 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ff9e434-1b93-4cf9-b51b-b612da62e596-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.072090 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.072101 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.073049 4954 scope.go:117] "RemoveContainer" containerID="faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.081895 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.116477 4954 scope.go:117] "RemoveContainer" containerID="262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.117555 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0\": container with ID starting with 262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0 not found: ID does not exist" containerID="262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.117632 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0"} err="failed to get container status \"262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0\": rpc error: code = NotFound desc = could not find container \"262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0\": container with ID starting with 262d32cea62cfe448cf474c3eb2d9b440dd17c7df4b5f10a127108cf648ebfa0 not found: ID does not exist" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.117680 4954 scope.go:117] "RemoveContainer" containerID="7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.118311 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c\": container with ID starting with 7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c not found: ID does not exist" containerID="7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.118399 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c"} err="failed to get container status \"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c\": rpc error: code = NotFound desc = could not find container \"7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c\": container with ID starting with 7c47885260f8b62a15cdeb0099b4f4ec9a09f934d2bc6ca618962c38ea4c740c not found: ID does not exist" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.118445 4954 scope.go:117] "RemoveContainer" containerID="d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.119040 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d\": container with ID starting with d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d not found: ID does not exist" containerID="d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.119092 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d"} err="failed to get container status \"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d\": rpc error: code = NotFound desc = could not 
find container \"d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d\": container with ID starting with d760feaab899550200b5e157a4dc9d636c54ae00ea24b86de2f95c51e3b7443d not found: ID does not exist" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.119127 4954 scope.go:117] "RemoveContainer" containerID="faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.119743 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50\": container with ID starting with faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50 not found: ID does not exist" containerID="faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.119793 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50"} err="failed to get container status \"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50\": rpc error: code = NotFound desc = could not find container \"faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50\": container with ID starting with faafb507d5c0d5a0975b20deca4daa2ebb7bd5bed652005e7e001c314a337e50 not found: ID does not exist" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.122335 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data" (OuterVolumeSpecName: "config-data") pod "1ff9e434-1b93-4cf9-b51b-b612da62e596" (UID: "1ff9e434-1b93-4cf9-b51b-b612da62e596"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.174061 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.174109 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9e434-1b93-4cf9-b51b-b612da62e596-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.243426 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.258903 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.296839 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.297301 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="proxy-httpd" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297325 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="proxy-httpd" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.297342 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-central-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297351 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-central-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.297371 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-notification-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297379 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-notification-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: E1206 07:20:16.297391 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="sg-core" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297397 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="sg-core" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297633 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-notification-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297647 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="sg-core" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297661 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="ceilometer-central-agent" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.297681 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" containerName="proxy-httpd" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.300404 4954 util.go:30] "No sandbox for 
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.300404 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.304527 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.304653 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.313190 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.316725 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394092 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394139 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394161 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394213 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394243 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.394269 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shc72\" (UniqueName: \"kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495614 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495684 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495711 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495776 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495814 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.495839 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shc72\" (UniqueName: \"kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.498056 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.498276 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.503283 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.504293 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.504376 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.504530 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.506808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.517777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shc72\" (UniqueName: \"kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72\") pod \"ceilometer-0\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:16 crc kubenswrapper[4954]: I1206 07:20:16.655972 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:17 crc kubenswrapper[4954]: I1206 07:20:17.151231 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:17 crc kubenswrapper[4954]: W1206 07:20:17.158119 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25959096_d4c4_42db_83a2_8f397f9d1710.slice/crio-adb01740f2c9b3269637916def634f386b585fae9228802a575332dc944a8c55 WatchSource:0}: Error finding container adb01740f2c9b3269637916def634f386b585fae9228802a575332dc944a8c55: Status 404 returned error can't find the container with id adb01740f2c9b3269637916def634f386b585fae9228802a575332dc944a8c55 Dec 06 07:20:17 crc kubenswrapper[4954]: I1206 07:20:17.478591 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff9e434-1b93-4cf9-b51b-b612da62e596" path="/var/lib/kubelet/pods/1ff9e434-1b93-4cf9-b51b-b612da62e596/volumes" Dec 06 07:20:17 crc kubenswrapper[4954]: I1206 07:20:17.926119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerStarted","Data":"adb01740f2c9b3269637916def634f386b585fae9228802a575332dc944a8c55"} Dec 06 07:20:18 crc kubenswrapper[4954]: I1206 07:20:18.940534 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerStarted","Data":"4e9d31b4419dd8f7c7b842958287cba960b323dc5be6ddb006d99bb77ff28566"} Dec 06 07:20:19 crc kubenswrapper[4954]: E1206 07:20:19.711973 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 07:20:24 crc kubenswrapper[4954]: I1206 07:20:24.024294 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerStarted","Data":"ca8b32804c6f751f8f500c6cadafcc89fead9efadeafe62e84bcd778860bbf76"} Dec 06 07:20:24 crc kubenswrapper[4954]: I1206 07:20:24.028553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" event={"ID":"bfda18f4-e714-4a51-a6fe-9d52cef7605b","Type":"ContainerStarted","Data":"e3081561a734c7c5623a46dc8f2eda43010b66d623c6f028647ac25db2e6ffee"} Dec 06 07:20:24 crc kubenswrapper[4954]: I1206 07:20:24.069455 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" podStartSLOduration=1.7282530280000001 podStartE2EDuration="12.069433177s" podCreationTimestamp="2025-12-06 07:20:12 +0000 UTC" firstStartedPulling="2025-12-06 07:20:13.000212699 +0000 UTC m=+1387.813572108" lastFinishedPulling="2025-12-06 07:20:23.341392868 +0000 UTC 
m=+1398.154752257" observedRunningTime="2025-12-06 07:20:24.059494921 +0000 UTC m=+1398.872854320" watchObservedRunningTime="2025-12-06 07:20:24.069433177 +0000 UTC m=+1398.882792556" Dec 06 07:20:25 crc kubenswrapper[4954]: I1206 07:20:25.040512 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerStarted","Data":"523a1b4e8fda7e67012e0b4567e66edebc97e264faa79eabbb79fbf06671a000"} Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.054842 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerStarted","Data":"9ac7910892b30e266844c6d2b15588817c3c0983d9002cbef1ba2cb537b1e89a"} Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.055126 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-central-agent" containerID="cri-o://4e9d31b4419dd8f7c7b842958287cba960b323dc5be6ddb006d99bb77ff28566" gracePeriod=30 Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.055227 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="sg-core" containerID="cri-o://523a1b4e8fda7e67012e0b4567e66edebc97e264faa79eabbb79fbf06671a000" gracePeriod=30 Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.055227 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-notification-agent" containerID="cri-o://ca8b32804c6f751f8f500c6cadafcc89fead9efadeafe62e84bcd778860bbf76" gracePeriod=30 Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.055248 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="proxy-httpd" containerID="cri-o://9ac7910892b30e266844c6d2b15588817c3c0983d9002cbef1ba2cb537b1e89a" gracePeriod=30 Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.055340 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:20:26 crc kubenswrapper[4954]: I1206 07:20:26.093978 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6703441570000002 podStartE2EDuration="10.093948941s" podCreationTimestamp="2025-12-06 07:20:16 +0000 UTC" firstStartedPulling="2025-12-06 07:20:17.1607403 +0000 UTC m=+1391.974099689" lastFinishedPulling="2025-12-06 07:20:25.584345084 +0000 UTC m=+1400.397704473" observedRunningTime="2025-12-06 07:20:26.088936597 +0000 UTC m=+1400.902295986" watchObservedRunningTime="2025-12-06 07:20:26.093948941 +0000 UTC m=+1400.907308330" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.265928 4954 generic.go:334] "Generic (PLEG): container finished" podID="25959096-d4c4-42db-83a2-8f397f9d1710" containerID="9ac7910892b30e266844c6d2b15588817c3c0983d9002cbef1ba2cb537b1e89a" exitCode=0 Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266370 4954 generic.go:334] "Generic (PLEG): container finished" podID="25959096-d4c4-42db-83a2-8f397f9d1710" containerID="523a1b4e8fda7e67012e0b4567e66edebc97e264faa79eabbb79fbf06671a000" exitCode=2 Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266380 4954 generic.go:334] 
"Generic (PLEG): container finished" podID="25959096-d4c4-42db-83a2-8f397f9d1710" containerID="ca8b32804c6f751f8f500c6cadafcc89fead9efadeafe62e84bcd778860bbf76" exitCode=0 Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266389 4954 generic.go:334] "Generic (PLEG): container finished" podID="25959096-d4c4-42db-83a2-8f397f9d1710" containerID="4e9d31b4419dd8f7c7b842958287cba960b323dc5be6ddb006d99bb77ff28566" exitCode=0 Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266413 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerDied","Data":"9ac7910892b30e266844c6d2b15588817c3c0983d9002cbef1ba2cb537b1e89a"} Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266454 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerDied","Data":"523a1b4e8fda7e67012e0b4567e66edebc97e264faa79eabbb79fbf06671a000"} Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266475 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerDied","Data":"ca8b32804c6f751f8f500c6cadafcc89fead9efadeafe62e84bcd778860bbf76"} Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.266487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerDied","Data":"4e9d31b4419dd8f7c7b842958287cba960b323dc5be6ddb006d99bb77ff28566"} Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.725700 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.860934 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861003 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861026 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861175 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shc72\" (UniqueName: \"kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") 
" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861251 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861308 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861333 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd\") pod \"25959096-d4c4-42db-83a2-8f397f9d1710\" (UID: \"25959096-d4c4-42db-83a2-8f397f9d1710\") " Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861909 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.861925 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.866918 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72" (OuterVolumeSpecName: "kube-api-access-shc72") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "kube-api-access-shc72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.873787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts" (OuterVolumeSpecName: "scripts") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.892300 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.926360 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965613 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965694 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965709 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965723 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shc72\" (UniqueName: \"kubernetes.io/projected/25959096-d4c4-42db-83a2-8f397f9d1710-kube-api-access-shc72\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965738 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.965749 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25959096-d4c4-42db-83a2-8f397f9d1710-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.968002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:27 crc kubenswrapper[4954]: I1206 07:20:27.982920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data" (OuterVolumeSpecName: "config-data") pod "25959096-d4c4-42db-83a2-8f397f9d1710" (UID: "25959096-d4c4-42db-83a2-8f397f9d1710"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.067547 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.067830 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25959096-d4c4-42db-83a2-8f397f9d1710-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.281045 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25959096-d4c4-42db-83a2-8f397f9d1710","Type":"ContainerDied","Data":"adb01740f2c9b3269637916def634f386b585fae9228802a575332dc944a8c55"} Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.281112 4954 scope.go:117] "RemoveContainer" containerID="9ac7910892b30e266844c6d2b15588817c3c0983d9002cbef1ba2cb537b1e89a" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.281147 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.327462 4954 scope.go:117] "RemoveContainer" containerID="523a1b4e8fda7e67012e0b4567e66edebc97e264faa79eabbb79fbf06671a000" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.331787 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.342285 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.359450 4954 scope.go:117] "RemoveContainer" containerID="ca8b32804c6f751f8f500c6cadafcc89fead9efadeafe62e84bcd778860bbf76" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.365863 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:28 crc kubenswrapper[4954]: E1206 07:20:28.366724 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-central-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.366851 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-central-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: E1206 07:20:28.366963 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="proxy-httpd" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367044 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="proxy-httpd" Dec 06 07:20:28 crc kubenswrapper[4954]: E1206 07:20:28.367125 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="sg-core" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367202 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="sg-core" Dec 06 07:20:28 crc kubenswrapper[4954]: E1206 07:20:28.367296 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-notification-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367370 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-notification-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367720 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="proxy-httpd" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367811 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="sg-core" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367899 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-central-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.367988 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" containerName="ceilometer-notification-agent" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.370340 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.380978 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.381120 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.381170 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.382163 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.382235 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.382970 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.383058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.383178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.383226 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.383527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndvcl\" (UniqueName: \"kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.383603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515611 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndvcl\" (UniqueName: \"kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515825 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515860 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515911 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 
07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.515975 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.517078 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.517790 4954 scope.go:117] "RemoveContainer" containerID="4e9d31b4419dd8f7c7b842958287cba960b323dc5be6ddb006d99bb77ff28566" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.518073 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.526606 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.526651 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.529222 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.531267 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.540978 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.549981 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.568487 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndvcl\" (UniqueName: \"kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl\") pod \"ceilometer-0\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " pod="openstack/ceilometer-0" Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.822983 4954 util.go:30] "No sandbox for pod can be 
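Note that the recreated ceilometer-0 now runs under pod UID c5e1395f-4cb7-43cb-95bc-6af68b1afe12 while the old UID 25959096-... is still being cleaned up; the kubelet treats the two as unrelated pods. The SyncLoop ADD/UPDATE/DELETE/REMOVE markers threaded through this journal come from the kubelet's main sync loop; a simplified dispatch sketch (event names mirror the log, the handler itself is hypothetical):

```go
// Simplified, assumed sketch of the sync-loop dispatch behind the
// "SyncLoop ..." lines in this journal; not the kubelet's real loop.
package main

import "fmt"

type podEvent struct {
	op  string // ADD, UPDATE, DELETE (graceful), REMOVE (object gone from API)
	pod string
}

func dispatch(ev podEvent) {
	switch ev.op {
	case "ADD", "UPDATE":
		fmt.Printf("SyncLoop %s source=api pods=[%s]\n", ev.op, ev.pod)
	case "DELETE":
		// Graceful deletion: containers are killed with a grace period.
		fmt.Printf("SyncLoop DELETE source=api pods=[%s]\n", ev.pod)
	case "REMOVE":
		// Final removal: only local cleanup remains.
		fmt.Printf("SyncLoop REMOVE source=api pods=[%s]\n", ev.pod)
	}
}

func main() {
	for _, ev := range []podEvent{
		{"DELETE", "openstack/ceilometer-0"},
		{"REMOVE", "openstack/ceilometer-0"},
		{"ADD", "openstack/ceilometer-0"},
	} {
		dispatch(ev)
	}
}
```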
Dec 06 07:20:28 crc kubenswrapper[4954]: I1206 07:20:28.822983 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 06 07:20:29 crc kubenswrapper[4954]: I1206 07:20:29.498012 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25959096-d4c4-42db-83a2-8f397f9d1710" path="/var/lib/kubelet/pods/25959096-d4c4-42db-83a2-8f397f9d1710/volumes"
Dec 06 07:20:29 crc kubenswrapper[4954]: I1206 07:20:29.507672 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:20:29 crc kubenswrapper[4954]: I1206 07:20:29.558329 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 07:20:29 crc kubenswrapper[4954]: I1206 07:20:29.558659 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-log" containerID="cri-o://99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3" gracePeriod=30
Dec 06 07:20:29 crc kubenswrapper[4954]: I1206 07:20:29.559226 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-httpd" containerID="cri-o://95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b" gracePeriod=30
Dec 06 07:20:30 crc kubenswrapper[4954]: E1206 07:20:30.050818 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache]"
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.306830 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerStarted","Data":"e0b9d03233f12a5a3c2301c097d0c2ddbb68693318141fbf84723f49b1c6de88"}
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.311937 4954 generic.go:334] "Generic (PLEG): container finished" podID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerID="99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3" exitCode=143
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.312038 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerDied","Data":"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3"}
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.506602 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.506934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-log" containerID="cri-o://14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161" gracePeriod=30
Dec 06 07:20:30 crc kubenswrapper[4954]: I1206 07:20:30.507111 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-httpd" containerID="cri-o://d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62" gracePeriod=30
Dec 06 07:20:31 crc kubenswrapper[4954]: I1206 07:20:31.059676 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:20:31 crc kubenswrapper[4954]: I1206 07:20:31.328695 4954 generic.go:334] "Generic (PLEG): container finished" podID="99f880ab-3992-479d-b71c-d71152a6199a" containerID="14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161" exitCode=143
Dec 06 07:20:31 crc kubenswrapper[4954]: I1206 07:20:31.328809 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerDied","Data":"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161"}
Dec 06 07:20:31 crc kubenswrapper[4954]: I1206 07:20:31.331085 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerStarted","Data":"9de561b2c0b76bb9abadca62666b3e7ba539866befc7f158acdeb0bda27b4816"}
Dec 06 07:20:31 crc kubenswrapper[4954]: I1206 07:20:31.331227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerStarted","Data":"1d3018cb0323a55dd27ab50dfc8c1c45d036e3a0d933efaeeabf7c8a0d828c83"}
Dec 06 07:20:32 crc kubenswrapper[4954]: I1206 07:20:32.430901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerStarted","Data":"310f5d91362bc584e31c754ad9228af3c1e1b630ab73151ab721f60a88f25318"}
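The glance-log containers above exit with code 143 after the grace-period kill: 143 = 128 + 15 (SIGTERM), the usual convention for a process terminated by a signal, so these are expected results of a graceful stop rather than crashes. (Compare sg-core's exitCode=2 earlier, an application-level error status, and the clean exitCode=0 of the other ceilometer containers.) A one-liner to confirm the arithmetic on Linux:

```go
// 128 + signal number is the conventional exit code for a signal death;
// SIGTERM is 15 on Linux, so a graceful kill surfaces as exit code 143.
package main

import (
	"fmt"
	"syscall"
)

func main() {
	fmt.Println(128 + int(syscall.SIGTERM)) // prints 143
}
```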
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446022 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446106 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446189 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446230 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446277 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw5h4\" (UniqueName: \"kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446307 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.446385 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.448327 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.452365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs" (OuterVolumeSpecName: "logs") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.460744 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts" (OuterVolumeSpecName: "scripts") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.461349 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.461428 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4" (OuterVolumeSpecName: "kube-api-access-hw5h4") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "kube-api-access-hw5h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.465095 4954 generic.go:334] "Generic (PLEG): container finished" podID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerID="95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b" exitCode=0 Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.465242 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.482463 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.551918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs\") pod \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\" (UID: \"55a1a9c0-08bd-400d-8b38-b4226dadc5d5\") " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.553949 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.554086 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.554218 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.554345 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw5h4\" (UniqueName: \"kubernetes.io/projected/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-kube-api-access-hw5h4\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.554465 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.554602 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.657885 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data" (OuterVolumeSpecName: "config-data") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.658781 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.663878 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.689120 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerDied","Data":"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b"} Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.689170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55a1a9c0-08bd-400d-8b38-b4226dadc5d5","Type":"ContainerDied","Data":"2761ee60708c7ea3fa05d389fccee48b2a7505d17e82f6bcb595cf0862d01396"} Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.689192 4954 scope.go:117] "RemoveContainer" containerID="95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.744511 4954 scope.go:117] "RemoveContainer" containerID="99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.751504 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55a1a9c0-08bd-400d-8b38-b4226dadc5d5" (UID: "55a1a9c0-08bd-400d-8b38-b4226dadc5d5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.761906 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.761942 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a1a9c0-08bd-400d-8b38-b4226dadc5d5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.852839 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.867049 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.922451 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:20:33 crc kubenswrapper[4954]: E1206 07:20:33.923040 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-httpd" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.923056 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-httpd" Dec 06 07:20:33 crc kubenswrapper[4954]: E1206 07:20:33.923075 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-log" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.923082 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-log" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.923357 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-log" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.923373 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" containerName="glance-httpd" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.924600 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.928313 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.928641 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.928880 4954 scope.go:117] "RemoveContainer" containerID="95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.934344 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:20:33 crc kubenswrapper[4954]: E1206 07:20:33.942107 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b\": container with ID starting with 95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b not found: ID does not exist" containerID="95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.942179 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b"} err="failed to get container status \"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b\": rpc error: code = NotFound desc = could not find container \"95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b\": container with ID starting with 95f70c01692c02f12be12c27c5292ae7d782b346fe0682d4e7ac933a4509847b not found: ID does not exist" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.942236 4954 scope.go:117] "RemoveContainer" containerID="99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3" Dec 06 07:20:33 crc kubenswrapper[4954]: E1206 07:20:33.943034 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3\": container with ID starting with 99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3 not found: ID does not exist" containerID="99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.943100 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3"} err="failed to get container status \"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3\": rpc error: code = NotFound desc = could not find container \"99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3\": container with ID starting with 99c8dea202a8920a4eee42b890be9d604676fddb74f851b9d59b1959e543e7d3 not found: ID does not exist" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.980227 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.980323 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.980344 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.981189 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.981429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.981517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wkrf\" (UniqueName: \"kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.981574 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:33 crc kubenswrapper[4954]: I1206 07:20:33.981609 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084065 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 
07:20:34.084239 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084260 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084317 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084379 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084412 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wkrf\" (UniqueName: \"kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.084951 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.085072 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.088199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.096590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.098081 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.098144 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.105318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.111196 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wkrf\" (UniqueName: \"kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.133544 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.254790 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.433478 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerStarted","Data":"497c08cb9a4ca0a94ca122c48ba38edc1cb5bfdcd494b1af9f3af76f70950ae7"} Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551324 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-central-agent" containerID="cri-o://1d3018cb0323a55dd27ab50dfc8c1c45d036e3a0d933efaeeabf7c8a0d828c83" gracePeriod=30 Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551427 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551815 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="proxy-httpd" containerID="cri-o://497c08cb9a4ca0a94ca122c48ba38edc1cb5bfdcd494b1af9f3af76f70950ae7" gracePeriod=30 Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551861 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="sg-core" containerID="cri-o://310f5d91362bc584e31c754ad9228af3c1e1b630ab73151ab721f60a88f25318" gracePeriod=30 Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.551896 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-notification-agent" containerID="cri-o://9de561b2c0b76bb9abadca62666b3e7ba539866befc7f158acdeb0bda27b4816" gracePeriod=30 Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.577382 4954 generic.go:334] "Generic (PLEG): container finished" podID="99f880ab-3992-479d-b71c-d71152a6199a" containerID="d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62" exitCode=0 Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.577446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerDied","Data":"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62"} Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.577483 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99f880ab-3992-479d-b71c-d71152a6199a","Type":"ContainerDied","Data":"13f7dfaff04d47297b2510bf2b77929baba939adcfb9c219f900ce62c96d596b"} Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.577507 4954 scope.go:117] "RemoveContainer" containerID="d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.577789 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.600382 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.354153838 podStartE2EDuration="6.600349678s" podCreationTimestamp="2025-12-06 07:20:28 +0000 UTC" firstStartedPulling="2025-12-06 07:20:29.543488631 +0000 UTC m=+1404.356848020" lastFinishedPulling="2025-12-06 07:20:33.789684471 +0000 UTC m=+1408.603043860" observedRunningTime="2025-12-06 07:20:34.583381413 +0000 UTC m=+1409.396740792" watchObservedRunningTime="2025-12-06 07:20:34.600349678 +0000 UTC m=+1409.413709067" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620223 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620328 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620392 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620481 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz8pk\" (UniqueName: \"kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620511 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620586 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.620622 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"99f880ab-3992-479d-b71c-d71152a6199a\" (UID: \"99f880ab-3992-479d-b71c-d71152a6199a\") " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.621997 4954 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.623056 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs" (OuterVolumeSpecName: "logs") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.631763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts" (OuterVolumeSpecName: "scripts") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.636434 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.643239 4954 scope.go:117] "RemoveContainer" containerID="14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.643306 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk" (OuterVolumeSpecName: "kube-api-access-hz8pk") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "kube-api-access-hz8pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.656889 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.708640 4954 scope.go:117] "RemoveContainer" containerID="d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62" Dec 06 07:20:34 crc kubenswrapper[4954]: E1206 07:20:34.712284 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62\": container with ID starting with d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62 not found: ID does not exist" containerID="d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.712351 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62"} err="failed to get container status \"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62\": rpc error: code = NotFound desc = could not find container \"d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62\": container with ID starting with d7e0c85d4629bccacf706938d220e273e792bc1181cb2b98cadc350830f3ff62 not found: ID does not exist" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.712389 4954 scope.go:117] "RemoveContainer" containerID="14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161" Dec 06 07:20:34 crc kubenswrapper[4954]: E1206 07:20:34.713184 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161\": container with ID starting with 14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161 not found: ID does not exist" containerID="14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.713283 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161"} err="failed to get container status \"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161\": rpc error: code = NotFound desc = could not find container \"14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161\": container with ID starting with 14336c5639296bdf66b099aa7e78137865151ca77f4751466d1020a44df20161 not found: ID does not exist" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726133 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726180 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz8pk\" (UniqueName: \"kubernetes.io/projected/99f880ab-3992-479d-b71c-d71152a6199a-kube-api-access-hz8pk\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726235 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726253 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-httpd-run\") on node \"crc\" 
DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726269 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f880ab-3992-479d-b71c-d71152a6199a-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.726282 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.738470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.872660 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.886861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data" (OuterVolumeSpecName: "config-data") pod "99f880ab-3992-479d-b71c-d71152a6199a" (UID: "99f880ab-3992-479d-b71c-d71152a6199a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.889846 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.889878 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f880ab-3992-479d-b71c-d71152a6199a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:34 crc kubenswrapper[4954]: I1206 07:20:34.889887 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.251197 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.278306 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.315520 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:20:35 crc kubenswrapper[4954]: E1206 07:20:35.326634 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-log" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.326689 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-log" Dec 06 07:20:35 crc kubenswrapper[4954]: E1206 07:20:35.326700 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-httpd" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 
07:20:35.326709 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-httpd" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.327117 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-httpd" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.327154 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f880ab-3992-479d-b71c-d71152a6199a" containerName="glance-log" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.335518 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.341845 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.343822 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.499328 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a1a9c0-08bd-400d-8b38-b4226dadc5d5" path="/var/lib/kubelet/pods/55a1a9c0-08bd-400d-8b38-b4226dadc5d5/volumes" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.504011 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f880ab-3992-479d-b71c-d71152a6199a" path="/var/lib/kubelet/pods/99f880ab-3992-479d-b71c-d71152a6199a/volumes" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.505528 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.505581 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524500 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9jw\" (UniqueName: \"kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524596 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524629 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524666 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc 
kubenswrapper[4954]: I1206 07:20:35.524698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524789 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.524809 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.600722 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerStarted","Data":"5891bcf5b4323836ecdec4e35150830c0d731502463d8f7b5a584e351eeca4b5"} Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626094 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626165 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626204 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626310 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626329 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626425 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9jw\" (UniqueName: \"kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626457 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.626938 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.628057 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.629066 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.633716 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.635578 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.637265 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.640199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649682 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerID="497c08cb9a4ca0a94ca122c48ba38edc1cb5bfdcd494b1af9f3af76f70950ae7" exitCode=0 Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649747 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerID="310f5d91362bc584e31c754ad9228af3c1e1b630ab73151ab721f60a88f25318" exitCode=2 Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649759 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerID="9de561b2c0b76bb9abadca62666b3e7ba539866befc7f158acdeb0bda27b4816" exitCode=0 Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerDied","Data":"497c08cb9a4ca0a94ca122c48ba38edc1cb5bfdcd494b1af9f3af76f70950ae7"} Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerDied","Data":"310f5d91362bc584e31c754ad9228af3c1e1b630ab73151ab721f60a88f25318"} Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.649848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerDied","Data":"9de561b2c0b76bb9abadca62666b3e7ba539866befc7f158acdeb0bda27b4816"} Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.653554 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9jw\" (UniqueName: \"kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.688907 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") " pod="openstack/glance-default-internal-api-0" Dec 06 07:20:35 crc kubenswrapper[4954]: I1206 07:20:35.703416 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:36 crc kubenswrapper[4954]: I1206 07:20:36.561463 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:20:36 crc kubenswrapper[4954]: W1206 07:20:36.570481 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79dc6de3_cf27_4c1d_91c5_f922acd48400.slice/crio-ce1cdbb2c6fc2625fbd0e04cdb4ed598c45460bf2f09b1ec8999150c3f5d1660 WatchSource:0}: Error finding container ce1cdbb2c6fc2625fbd0e04cdb4ed598c45460bf2f09b1ec8999150c3f5d1660: Status 404 returned error can't find the container with id ce1cdbb2c6fc2625fbd0e04cdb4ed598c45460bf2f09b1ec8999150c3f5d1660 Dec 06 07:20:36 crc kubenswrapper[4954]: I1206 07:20:36.695919 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerStarted","Data":"0bb0b24e4d3085b50b85c5df8d24f8ed4c7c0c28e8532d6d7a2d8957c39ce556"} Dec 06 07:20:36 crc kubenswrapper[4954]: I1206 07:20:36.702066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerStarted","Data":"ce1cdbb2c6fc2625fbd0e04cdb4ed598c45460bf2f09b1ec8999150c3f5d1660"} Dec 06 07:20:37 crc kubenswrapper[4954]: I1206 07:20:37.727189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerStarted","Data":"dff634325c63563a5768825e16323ab3d990f36bc7c9fcb6773bf50f64fbe138"} Dec 06 07:20:37 crc kubenswrapper[4954]: I1206 07:20:37.733723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerStarted","Data":"c848e10be510af8ec76555e3960b2f1fc5ccf1b3023d60abc7f4ffb8dd93dba2"} Dec 06 07:20:37 crc kubenswrapper[4954]: I1206 07:20:37.758537 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.758514178 podStartE2EDuration="4.758514178s" podCreationTimestamp="2025-12-06 07:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:37.752287581 +0000 UTC m=+1412.565646990" watchObservedRunningTime="2025-12-06 07:20:37.758514178 +0000 UTC m=+1412.571873567" Dec 06 07:20:38 crc kubenswrapper[4954]: I1206 07:20:38.750040 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerStarted","Data":"e09a4c785be2122538bcc0a01f7e6782d054fe47c518d01aab3ac5049b21cf94"} Dec 06 07:20:38 crc kubenswrapper[4954]: I1206 07:20:38.826316 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.826291646 podStartE2EDuration="3.826291646s" podCreationTimestamp="2025-12-06 07:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:38.798407607 +0000 UTC m=+1413.611766986" watchObservedRunningTime="2025-12-06 07:20:38.826291646 +0000 UTC m=+1413.639651035" Dec 06 07:20:39 crc kubenswrapper[4954]: I1206 
07:20:39.763397 4954 generic.go:334] "Generic (PLEG): container finished" podID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerID="1d3018cb0323a55dd27ab50dfc8c1c45d036e3a0d933efaeeabf7c8a0d828c83" exitCode=0 Dec 06 07:20:39 crc kubenswrapper[4954]: I1206 07:20:39.763459 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerDied","Data":"1d3018cb0323a55dd27ab50dfc8c1c45d036e3a0d933efaeeabf7c8a0d828c83"} Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.137281 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.279676 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.279817 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.279932 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndvcl\" (UniqueName: \"kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.280011 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.280072 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.280111 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.280170 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.280223 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle\") pod \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\" (UID: \"c5e1395f-4cb7-43cb-95bc-6af68b1afe12\") " Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 
07:20:40.280957 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.281023 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.281243 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.281271 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.293781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts" (OuterVolumeSpecName: "scripts") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.293821 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl" (OuterVolumeSpecName: "kube-api-access-ndvcl") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "kube-api-access-ndvcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.313251 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.346516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: E1206 07:20:40.347990 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.377888 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.385378 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.385429 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.385443 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.385456 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.385470 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndvcl\" (UniqueName: \"kubernetes.io/projected/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-kube-api-access-ndvcl\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.400629 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data" (OuterVolumeSpecName: "config-data") pod "c5e1395f-4cb7-43cb-95bc-6af68b1afe12" (UID: "c5e1395f-4cb7-43cb-95bc-6af68b1afe12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.487684 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e1395f-4cb7-43cb-95bc-6af68b1afe12-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.785533 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5e1395f-4cb7-43cb-95bc-6af68b1afe12","Type":"ContainerDied","Data":"e0b9d03233f12a5a3c2301c097d0c2ddbb68693318141fbf84723f49b1c6de88"} Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.786405 4954 scope.go:117] "RemoveContainer" containerID="497c08cb9a4ca0a94ca122c48ba38edc1cb5bfdcd494b1af9f3af76f70950ae7" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.785796 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.789460 4954 generic.go:334] "Generic (PLEG): container finished" podID="bfda18f4-e714-4a51-a6fe-9d52cef7605b" containerID="e3081561a734c7c5623a46dc8f2eda43010b66d623c6f028647ac25db2e6ffee" exitCode=0 Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.789591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" event={"ID":"bfda18f4-e714-4a51-a6fe-9d52cef7605b","Type":"ContainerDied","Data":"e3081561a734c7c5623a46dc8f2eda43010b66d623c6f028647ac25db2e6ffee"} Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.849981 4954 scope.go:117] "RemoveContainer" containerID="310f5d91362bc584e31c754ad9228af3c1e1b630ab73151ab721f60a88f25318" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.855582 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.867824 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.882969 4954 scope.go:117] "RemoveContainer" containerID="9de561b2c0b76bb9abadca62666b3e7ba539866befc7f158acdeb0bda27b4816" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.888154 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:40 crc kubenswrapper[4954]: E1206 07:20:40.889224 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-notification-agent" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.889375 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-notification-agent" Dec 06 07:20:40 crc kubenswrapper[4954]: E1206 07:20:40.889473 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="sg-core" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.889547 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="sg-core" Dec 06 07:20:40 crc kubenswrapper[4954]: E1206 07:20:40.889663 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-central-agent" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.889770 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-central-agent" 
Dec 06 07:20:40 crc kubenswrapper[4954]: E1206 07:20:40.889859 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="proxy-httpd" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.889933 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="proxy-httpd" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.890292 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="proxy-httpd" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.890394 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-central-agent" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.890492 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="ceilometer-notification-agent" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.890678 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" containerName="sg-core" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.893587 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.897388 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.897732 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.899258 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.911772 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.932832 4954 scope.go:117] "RemoveContainer" containerID="1d3018cb0323a55dd27ab50dfc8c1c45d036e3a0d933efaeeabf7c8a0d828c83" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998197 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998244 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4k7\" (UniqueName: \"kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998281 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:40 crc kubenswrapper[4954]: I1206 07:20:40.998385 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101488 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101705 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101767 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4k7\" (UniqueName: \"kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.101830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.102715 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.102876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.108322 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.109199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.131708 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.137129 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.137542 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.147280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4k7\" 
(UniqueName: \"kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7\") pod \"ceilometer-0\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.229092 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.460930 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e1395f-4cb7-43cb-95bc-6af68b1afe12" path="/var/lib/kubelet/pods/c5e1395f-4cb7-43cb-95bc-6af68b1afe12/volumes" Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.719975 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:20:41 crc kubenswrapper[4954]: W1206 07:20:41.731249 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb828b931_cffa_4170_b444_8d4c96dac8d4.slice/crio-a706dd2dab9c2de4356937699ef4e577a5e4a7f40414e1938584ed32379d77ed WatchSource:0}: Error finding container a706dd2dab9c2de4356937699ef4e577a5e4a7f40414e1938584ed32379d77ed: Status 404 returned error can't find the container with id a706dd2dab9c2de4356937699ef4e577a5e4a7f40414e1938584ed32379d77ed Dec 06 07:20:41 crc kubenswrapper[4954]: I1206 07:20:41.805686 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerStarted","Data":"a706dd2dab9c2de4356937699ef4e577a5e4a7f40414e1938584ed32379d77ed"} Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.381740 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.531828 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle\") pod \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.531954 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data\") pod \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.531983 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9k6\" (UniqueName: \"kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6\") pod \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.532034 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts\") pod \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\" (UID: \"bfda18f4-e714-4a51-a6fe-9d52cef7605b\") " Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.542443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6" (OuterVolumeSpecName: "kube-api-access-7d9k6") pod "bfda18f4-e714-4a51-a6fe-9d52cef7605b" (UID: 
"bfda18f4-e714-4a51-a6fe-9d52cef7605b"). InnerVolumeSpecName "kube-api-access-7d9k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.558830 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts" (OuterVolumeSpecName: "scripts") pod "bfda18f4-e714-4a51-a6fe-9d52cef7605b" (UID: "bfda18f4-e714-4a51-a6fe-9d52cef7605b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.618787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfda18f4-e714-4a51-a6fe-9d52cef7605b" (UID: "bfda18f4-e714-4a51-a6fe-9d52cef7605b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.631181 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data" (OuterVolumeSpecName: "config-data") pod "bfda18f4-e714-4a51-a6fe-9d52cef7605b" (UID: "bfda18f4-e714-4a51-a6fe-9d52cef7605b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.634543 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.634607 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.634626 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9k6\" (UniqueName: \"kubernetes.io/projected/bfda18f4-e714-4a51-a6fe-9d52cef7605b-kube-api-access-7d9k6\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.634642 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda18f4-e714-4a51-a6fe-9d52cef7605b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.817717 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerStarted","Data":"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197"} Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.819553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" event={"ID":"bfda18f4-e714-4a51-a6fe-9d52cef7605b","Type":"ContainerDied","Data":"c7f64cf275f1f70d2386c51361d049ef155cbe78ae92e936f128234ca67be2ab"} Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.819670 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f64cf275f1f70d2386c51361d049ef155cbe78ae92e936f128234ca67be2ab" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.819681 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lzjzc" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.944120 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:20:42 crc kubenswrapper[4954]: E1206 07:20:42.944756 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfda18f4-e714-4a51-a6fe-9d52cef7605b" containerName="nova-cell0-conductor-db-sync" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.944781 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfda18f4-e714-4a51-a6fe-9d52cef7605b" containerName="nova-cell0-conductor-db-sync" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.944957 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfda18f4-e714-4a51-a6fe-9d52cef7605b" containerName="nova-cell0-conductor-db-sync" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.945815 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.948243 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lnz9x" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.948472 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 07:20:42 crc kubenswrapper[4954]: I1206 07:20:42.967277 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.043395 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.043442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbf6\" (UniqueName: \"kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.043605 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.145275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.145324 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbf6\" (UniqueName: \"kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc 
kubenswrapper[4954]: I1206 07:20:43.145395 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.150147 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.151904 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.170002 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbf6\" (UniqueName: \"kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6\") pod \"nova-cell0-conductor-0\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.273426 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.780496 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:20:43 crc kubenswrapper[4954]: W1206 07:20:43.790134 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca2306ed_dcff_4143_a452_9a209d0a46a1.slice/crio-737c35ee429a8ed15bc551f188d6fb72b21e4170f86011d5bdfa2b550cdd9f47 WatchSource:0}: Error finding container 737c35ee429a8ed15bc551f188d6fb72b21e4170f86011d5bdfa2b550cdd9f47: Status 404 returned error can't find the container with id 737c35ee429a8ed15bc551f188d6fb72b21e4170f86011d5bdfa2b550cdd9f47 Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.870234 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerStarted","Data":"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6"} Dec 06 07:20:43 crc kubenswrapper[4954]: I1206 07:20:43.872526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca2306ed-dcff-4143-a452-9a209d0a46a1","Type":"ContainerStarted","Data":"737c35ee429a8ed15bc551f188d6fb72b21e4170f86011d5bdfa2b550cdd9f47"} Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.255583 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.255675 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.289555 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 
07:20:44.307810 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.886120 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca2306ed-dcff-4143-a452-9a209d0a46a1","Type":"ContainerStarted","Data":"358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f"} Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.886949 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.889950 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerStarted","Data":"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d"} Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.890354 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.890648 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 07:20:44 crc kubenswrapper[4954]: I1206 07:20:44.908081 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.908058699 podStartE2EDuration="2.908058699s" podCreationTimestamp="2025-12-06 07:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:44.906027045 +0000 UTC m=+1419.719386434" watchObservedRunningTime="2025-12-06 07:20:44.908058699 +0000 UTC m=+1419.721418098" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.703845 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.704416 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.786189 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.808843 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.905417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerStarted","Data":"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d"} Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.906553 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.906616 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:45 crc kubenswrapper[4954]: I1206 07:20:45.940906 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.16981179 podStartE2EDuration="5.940876948s" podCreationTimestamp="2025-12-06 07:20:40 +0000 UTC" firstStartedPulling="2025-12-06 
07:20:41.734206949 +0000 UTC m=+1416.547566338" lastFinishedPulling="2025-12-06 07:20:45.505272097 +0000 UTC m=+1420.318631496" observedRunningTime="2025-12-06 07:20:45.933767848 +0000 UTC m=+1420.747127247" watchObservedRunningTime="2025-12-06 07:20:45.940876948 +0000 UTC m=+1420.754236327" Dec 06 07:20:46 crc kubenswrapper[4954]: I1206 07:20:46.935185 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:20:46 crc kubenswrapper[4954]: I1206 07:20:46.937322 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:20:46 crc kubenswrapper[4954]: I1206 07:20:46.935766 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:20:47 crc kubenswrapper[4954]: I1206 07:20:47.469089 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 07:20:47 crc kubenswrapper[4954]: I1206 07:20:47.544076 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 07:20:47 crc kubenswrapper[4954]: I1206 07:20:47.946200 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:20:47 crc kubenswrapper[4954]: I1206 07:20:47.946250 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 07:20:48 crc kubenswrapper[4954]: I1206 07:20:48.593153 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:48 crc kubenswrapper[4954]: I1206 07:20:48.639431 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 07:20:50 crc kubenswrapper[4954]: E1206 07:20:50.621624 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache]" Dec 06 07:20:53 crc kubenswrapper[4954]: I1206 07:20:53.311313 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.024395 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qjzpd"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.026499 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.031286 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.031306 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.040414 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjzpd"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.114008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4jb\" (UniqueName: \"kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.114516 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.114600 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.114775 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.217668 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.217839 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.217910 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4jb\" (UniqueName: \"kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.218030 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.237248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.239014 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.245258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.257284 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4jb\" (UniqueName: \"kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb\") pod \"nova-cell0-cell-mapping-qjzpd\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.331367 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.333251 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.337226 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.354865 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.364543 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.421978 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.422089 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.423158 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.423232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsbq\" (UniqueName: \"kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.485001 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.486788 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.512065 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.525241 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.525323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.525370 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.525394 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsbq\" (UniqueName: \"kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.532139 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.543077 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.547823 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.574516 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.598808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsbq\" (UniqueName: \"kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq\") pod \"nova-api-0\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") " pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.615872 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.617797 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.627942 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.630023 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.644132 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7j85\" (UniqueName: \"kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.644392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.644639 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.644841 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.681168 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746224 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7j85\" (UniqueName: \"kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746247 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746287 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9r2c\" (UniqueName: \"kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746341 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746403 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.746446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.747873 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.747956 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.749941 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.787054 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.789747 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.795651 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.802929 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.812449 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.813883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.833883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7j85\" (UniqueName: \"kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85\") pod \"nova-metadata-0\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.851982 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmp5b\" (UniqueName: \"kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865486 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865509 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: 
\"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865657 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865689 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9r2c\" (UniqueName: \"kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.865797 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.880164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.889947 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.913101 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.933705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9r2c\" (UniqueName: \"kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c\") pod \"nova-scheduler-0\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") " pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.950472 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974145 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqhv\" (UniqueName: \"kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974231 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974273 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974301 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974396 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmp5b\" (UniqueName: \"kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974506 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974546 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.974625 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " 
pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.976209 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.976639 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.977836 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.980705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.987245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:54 crc kubenswrapper[4954]: I1206 07:20:54.993478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmp5b\" (UniqueName: \"kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b\") pod \"dnsmasq-dns-7bd87576bf-bh22g\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.077134 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.077324 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.077414 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqhv\" (UniqueName: \"kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.081909 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.084116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.096116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqhv\" (UniqueName: \"kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.125152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.181679 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.412154 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjzpd"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.675393 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.697175 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.776369 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlbn"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.778762 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.784598 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.785464 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.799538 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlbn"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.875809 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.875875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.875921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.875981 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62qh\" (UniqueName: \"kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.877837 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.923084 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.977462 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62qh\" (UniqueName: \"kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.977594 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.977639 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:55 crc kubenswrapper[4954]: I1206 07:20:55.977683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:55.999815 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.003317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.003978 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.012055 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62qh\" (UniqueName: \"kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh\") pod \"nova-cell1-conductor-db-sync-2jlbn\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.059056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjzpd" event={"ID":"3d7bda2f-6de7-4e30-be44-adc2196c3510","Type":"ContainerStarted","Data":"14e44df1724c72aff096b76324ca8eedc1cc0ab62e408634aaddcfc4fd8fdddb"} Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.078975 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerStarted","Data":"d93c8d084ce9ca568e511e2a333f40ebe622bcc72897226697d22773d3fa6702"} Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.095002 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2","Type":"ContainerStarted","Data":"addb3aac249e9e6ce83f45a2336a48c29ee8eeaf9d574b21f48d5f60a5584f55"} Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.124001 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerStarted","Data":"95cb1ecbb47ecc2268a8ea2aedc338f77fcc858b7d3b817cab69ee94bb7c5c19"} Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.147677 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.157109 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.205774 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:20:56 crc kubenswrapper[4954]: I1206 07:20:56.775551 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlbn"] Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.168230 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjzpd" event={"ID":"3d7bda2f-6de7-4e30-be44-adc2196c3510","Type":"ContainerStarted","Data":"fbf01d8c738f88aa0917c5da1a6fed31a6e5ed43c6500aa8c04f15f09dd7184a"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.173953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" event={"ID":"a98f49b7-0577-4cc4-a3cc-bf5392fae7af","Type":"ContainerStarted","Data":"5e837fa95d1fa9b7b115359fe96d4dd565c2b734b97c7894835e55a0e0e55069"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.174005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" event={"ID":"a98f49b7-0577-4cc4-a3cc-bf5392fae7af","Type":"ContainerStarted","Data":"a4f104bb9924f6ec13de44358f08c00de1789913706533e4ee94167df3cc6cb5"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.179050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81f000b6-60d1-4fd0-a294-8c1907b13d87","Type":"ContainerStarted","Data":"cf932f376a808ca50efcf8dbe28bb3d1bf7c94f5c1edddc9c131bce31e61bc84"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.182403 4954 generic.go:334] "Generic (PLEG): container finished" podID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerID="49cad8d92feec35081d61fb74310c81ffdb65596e75afa4f556d27a5b745dfa5" exitCode=0 Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.182446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" event={"ID":"cf8ab637-2f55-4fe9-ba71-e2f98657d262","Type":"ContainerDied","Data":"49cad8d92feec35081d61fb74310c81ffdb65596e75afa4f556d27a5b745dfa5"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.182466 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" event={"ID":"cf8ab637-2f55-4fe9-ba71-e2f98657d262","Type":"ContainerStarted","Data":"7921291b24459d67932e56a85bfc2e426431f8af5f83af89515bce9aa524ec30"} Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.193488 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qjzpd" podStartSLOduration=4.19346221 podStartE2EDuration="4.19346221s" podCreationTimestamp="2025-12-06 07:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:57.188235189 +0000 UTC m=+1432.001594598" watchObservedRunningTime="2025-12-06 07:20:57.19346221 +0000 UTC m=+1432.006821599" Dec 06 07:20:57 crc kubenswrapper[4954]: I1206 07:20:57.216888 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" podStartSLOduration=2.21686548 podStartE2EDuration="2.21686548s" 
podCreationTimestamp="2025-12-06 07:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:20:57.206164432 +0000 UTC m=+1432.019523821" watchObservedRunningTime="2025-12-06 07:20:57.21686548 +0000 UTC m=+1432.030224869" Dec 06 07:20:58 crc kubenswrapper[4954]: I1206 07:20:58.432713 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:20:58 crc kubenswrapper[4954]: I1206 07:20:58.448999 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.223382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerStarted","Data":"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.224012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerStarted","Data":"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.223647 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-metadata" containerID="cri-o://0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8" gracePeriod=30 Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.223454 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-log" containerID="cri-o://848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8" gracePeriod=30 Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.227069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2","Type":"ContainerStarted","Data":"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.230784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerStarted","Data":"d730da60e7e9b02874412058733684d323f5559903d10c98387eeac547cef214"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.230827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerStarted","Data":"0a5f7776ac6932a67493a983e7d3486a87d250c43ecfbbfd162fa44fdd435793"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.233452 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81f000b6-60d1-4fd0-a294-8c1907b13d87","Type":"ContainerStarted","Data":"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.233771 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="81f000b6-60d1-4fd0-a294-8c1907b13d87" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735" gracePeriod=30 Dec 06 07:21:00 crc 
kubenswrapper[4954]: I1206 07:21:00.238363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" event={"ID":"cf8ab637-2f55-4fe9-ba71-e2f98657d262","Type":"ContainerStarted","Data":"487f3aa5673d5d19a4b982a861f9ad3bd692ff743abe78c916a7c6664e7db129"} Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.238586 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.258124 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.676536679 podStartE2EDuration="6.258103787s" podCreationTimestamp="2025-12-06 07:20:54 +0000 UTC" firstStartedPulling="2025-12-06 07:20:55.872824646 +0000 UTC m=+1430.686184035" lastFinishedPulling="2025-12-06 07:20:59.454391744 +0000 UTC m=+1434.267751143" observedRunningTime="2025-12-06 07:21:00.24446122 +0000 UTC m=+1435.057820609" watchObservedRunningTime="2025-12-06 07:21:00.258103787 +0000 UTC m=+1435.071463176" Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.272769 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.535479841 podStartE2EDuration="6.272753622s" podCreationTimestamp="2025-12-06 07:20:54 +0000 UTC" firstStartedPulling="2025-12-06 07:20:55.696821336 +0000 UTC m=+1430.510180725" lastFinishedPulling="2025-12-06 07:20:59.434095117 +0000 UTC m=+1434.247454506" observedRunningTime="2025-12-06 07:21:00.267994324 +0000 UTC m=+1435.081353713" watchObservedRunningTime="2025-12-06 07:21:00.272753622 +0000 UTC m=+1435.086113011" Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.287855 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.065942246 podStartE2EDuration="6.287833228s" podCreationTimestamp="2025-12-06 07:20:54 +0000 UTC" firstStartedPulling="2025-12-06 07:20:56.211698692 +0000 UTC m=+1431.025058081" lastFinishedPulling="2025-12-06 07:20:59.433589674 +0000 UTC m=+1434.246949063" observedRunningTime="2025-12-06 07:21:00.28569126 +0000 UTC m=+1435.099050659" watchObservedRunningTime="2025-12-06 07:21:00.287833228 +0000 UTC m=+1435.101192617" Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.318922 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.818191623 podStartE2EDuration="6.318898184s" podCreationTimestamp="2025-12-06 07:20:54 +0000 UTC" firstStartedPulling="2025-12-06 07:20:55.934824215 +0000 UTC m=+1430.748183604" lastFinishedPulling="2025-12-06 07:20:59.435530746 +0000 UTC m=+1434.248890165" observedRunningTime="2025-12-06 07:21:00.306025818 +0000 UTC m=+1435.119385207" watchObservedRunningTime="2025-12-06 07:21:00.318898184 +0000 UTC m=+1435.132257573" Dec 06 07:21:00 crc kubenswrapper[4954]: I1206 07:21:00.339002 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" podStartSLOduration=6.338971115 podStartE2EDuration="6.338971115s" podCreationTimestamp="2025-12-06 07:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:00.331130694 +0000 UTC m=+1435.144490083" watchObservedRunningTime="2025-12-06 07:21:00.338971115 +0000 UTC m=+1435.152330494" Dec 06 07:21:00 crc kubenswrapper[4954]: E1206 
07:21:00.897553 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice/crio-7a9f1a4995c1ae26e3a0f2e72c85321e77738601ef9f2aed642805edfec4c31a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2968b_ef32_49ec_81e1_e3f07c3b73b8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfefc17_9a79_4857_b32d_d1b2f7ba15dc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 07:21:01 crc kubenswrapper[4954]: I1206 07:21:01.256883 4954 generic.go:334] "Generic (PLEG): container finished" podID="82a54673-0760-4196-adc8-728b1cf08d6f" containerID="848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8" exitCode=143 Dec 06 07:21:01 crc kubenswrapper[4954]: I1206 07:21:01.258328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerDied","Data":"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8"} Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.682714 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.683210 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.913790 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.914849 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.952645 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 07:21:04 crc kubenswrapper[4954]: I1206 07:21:04.952988 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.003814 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.127840 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.183229 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.224657 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.225415 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="dnsmasq-dns" containerID="cri-o://159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d" gracePeriod=10 Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.312924 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d7bda2f-6de7-4e30-be44-adc2196c3510" containerID="fbf01d8c738f88aa0917c5da1a6fed31a6e5ed43c6500aa8c04f15f09dd7184a" exitCode=0 Dec 
06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.313026 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjzpd" event={"ID":"3d7bda2f-6de7-4e30-be44-adc2196c3510","Type":"ContainerDied","Data":"fbf01d8c738f88aa0917c5da1a6fed31a6e5ed43c6500aa8c04f15f09dd7184a"} Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.316307 4954 generic.go:334] "Generic (PLEG): container finished" podID="a98f49b7-0577-4cc4-a3cc-bf5392fae7af" containerID="5e837fa95d1fa9b7b115359fe96d4dd565c2b734b97c7894835e55a0e0e55069" exitCode=0 Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.316480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" event={"ID":"a98f49b7-0577-4cc4-a3cc-bf5392fae7af","Type":"ContainerDied","Data":"5e837fa95d1fa9b7b115359fe96d4dd565c2b734b97c7894835e55a0e0e55069"} Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.378151 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.766145 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.767248 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.877153 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894082 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvgk\" (UniqueName: \"kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894151 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894329 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894351 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-swift-storage-0\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894402 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.894507 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config\") pod \"674dbe14-b0c4-4854-9861-c374f07568d0\" (UID: \"674dbe14-b0c4-4854-9861-c374f07568d0\") " Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.937650 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk" (OuterVolumeSpecName: "kube-api-access-xkvgk") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "kube-api-access-xkvgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.997800 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvgk\" (UniqueName: \"kubernetes.io/projected/674dbe14-b0c4-4854-9861-c374f07568d0-kube-api-access-xkvgk\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:05 crc kubenswrapper[4954]: I1206 07:21:05.998537 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.029418 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.047554 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.067037 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.073154 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config" (OuterVolumeSpecName: "config") pod "674dbe14-b0c4-4854-9861-c374f07568d0" (UID: "674dbe14-b0c4-4854-9861-c374f07568d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.101237 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.101281 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.101295 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.101307 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.101317 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674dbe14-b0c4-4854-9861-c374f07568d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.331018 4954 generic.go:334] "Generic (PLEG): container finished" podID="674dbe14-b0c4-4854-9861-c374f07568d0" containerID="159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d" exitCode=0 Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.331096 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.331154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" event={"ID":"674dbe14-b0c4-4854-9861-c374f07568d0","Type":"ContainerDied","Data":"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d"} Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.331235 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-b7jvx" event={"ID":"674dbe14-b0c4-4854-9861-c374f07568d0","Type":"ContainerDied","Data":"e8d3b972dfef9ed07865f254c8b8b8923cacc1fb86a4ea88c2547f9a1f55ae48"} Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.331261 4954 scope.go:117] "RemoveContainer" containerID="159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.438965 4954 scope.go:117] "RemoveContainer" containerID="b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.448322 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.460934 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-b7jvx"] Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.514746 4954 scope.go:117] "RemoveContainer" containerID="159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d" Dec 06 07:21:06 crc kubenswrapper[4954]: E1206 07:21:06.516290 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d\": container with ID starting with 159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d not found: ID does not exist" containerID="159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.516397 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d"} err="failed to get container status \"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d\": rpc error: code = NotFound desc = could not find container \"159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d\": container with ID starting with 159105e4f36e070d24eb1c382c8bc8b4025cd551fc9cc4825673883d1b46b92d not found: ID does not exist" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.516438 4954 scope.go:117] "RemoveContainer" containerID="b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a" Dec 06 07:21:06 crc kubenswrapper[4954]: E1206 07:21:06.518281 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a\": container with ID starting with b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a not found: ID does not exist" containerID="b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a" Dec 06 07:21:06 crc kubenswrapper[4954]: I1206 07:21:06.518341 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a"} err="failed to get container status 
\"b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a\": rpc error: code = NotFound desc = could not find container \"b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a\": container with ID starting with b7d95cd4a118ae38238e0ac1d517bb00599768e1f779f6d97f081fc261d5242a not found: ID does not exist" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.034037 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.040668 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.147752 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") pod \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.147835 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z62qh\" (UniqueName: \"kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh\") pod \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.147889 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle\") pod \"3d7bda2f-6de7-4e30-be44-adc2196c3510\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.147987 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg4jb\" (UniqueName: \"kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb\") pod \"3d7bda2f-6de7-4e30-be44-adc2196c3510\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.148007 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle\") pod \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.148058 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data\") pod \"3d7bda2f-6de7-4e30-be44-adc2196c3510\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.148082 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts\") pod \"3d7bda2f-6de7-4e30-be44-adc2196c3510\" (UID: \"3d7bda2f-6de7-4e30-be44-adc2196c3510\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.148152 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts\") pod \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " Dec 06 07:21:07 crc 
kubenswrapper[4954]: I1206 07:21:07.157927 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb" (OuterVolumeSpecName: "kube-api-access-mg4jb") pod "3d7bda2f-6de7-4e30-be44-adc2196c3510" (UID: "3d7bda2f-6de7-4e30-be44-adc2196c3510"). InnerVolumeSpecName "kube-api-access-mg4jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.157954 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts" (OuterVolumeSpecName: "scripts") pod "3d7bda2f-6de7-4e30-be44-adc2196c3510" (UID: "3d7bda2f-6de7-4e30-be44-adc2196c3510"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.158163 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts" (OuterVolumeSpecName: "scripts") pod "a98f49b7-0577-4cc4-a3cc-bf5392fae7af" (UID: "a98f49b7-0577-4cc4-a3cc-bf5392fae7af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.164734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh" (OuterVolumeSpecName: "kube-api-access-z62qh") pod "a98f49b7-0577-4cc4-a3cc-bf5392fae7af" (UID: "a98f49b7-0577-4cc4-a3cc-bf5392fae7af"). InnerVolumeSpecName "kube-api-access-z62qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: E1206 07:21:07.189974 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data podName:a98f49b7-0577-4cc4-a3cc-bf5392fae7af nodeName:}" failed. No retries permitted until 2025-12-06 07:21:07.689931495 +0000 UTC m=+1442.503290884 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data") pod "a98f49b7-0577-4cc4-a3cc-bf5392fae7af" (UID: "a98f49b7-0577-4cc4-a3cc-bf5392fae7af") : error deleting /var/lib/kubelet/pods/a98f49b7-0577-4cc4-a3cc-bf5392fae7af/volume-subpaths: remove /var/lib/kubelet/pods/a98f49b7-0577-4cc4-a3cc-bf5392fae7af/volume-subpaths: no such file or directory Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.196700 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a98f49b7-0577-4cc4-a3cc-bf5392fae7af" (UID: "a98f49b7-0577-4cc4-a3cc-bf5392fae7af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.219755 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data" (OuterVolumeSpecName: "config-data") pod "3d7bda2f-6de7-4e30-be44-adc2196c3510" (UID: "3d7bda2f-6de7-4e30-be44-adc2196c3510"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.225111 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d7bda2f-6de7-4e30-be44-adc2196c3510" (UID: "3d7bda2f-6de7-4e30-be44-adc2196c3510"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251169 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg4jb\" (UniqueName: \"kubernetes.io/projected/3d7bda2f-6de7-4e30-be44-adc2196c3510-kube-api-access-mg4jb\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251409 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251424 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251436 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251445 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251454 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z62qh\" (UniqueName: \"kubernetes.io/projected/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-kube-api-access-z62qh\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.251464 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7bda2f-6de7-4e30-be44-adc2196c3510-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.346898 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjzpd" event={"ID":"3d7bda2f-6de7-4e30-be44-adc2196c3510","Type":"ContainerDied","Data":"14e44df1724c72aff096b76324ca8eedc1cc0ab62e408634aaddcfc4fd8fdddb"} Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.346958 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e44df1724c72aff096b76324ca8eedc1cc0ab62e408634aaddcfc4fd8fdddb" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.347051 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjzpd" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.358611 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.359024 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2jlbn" event={"ID":"a98f49b7-0577-4cc4-a3cc-bf5392fae7af","Type":"ContainerDied","Data":"a4f104bb9924f6ec13de44358f08c00de1789913706533e4ee94167df3cc6cb5"} Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.359070 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f104bb9924f6ec13de44358f08c00de1789913706533e4ee94167df3cc6cb5" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.462125 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" path="/var/lib/kubelet/pods/674dbe14-b0c4-4854-9861-c374f07568d0/volumes" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.487399 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 07:21:07 crc kubenswrapper[4954]: E1206 07:21:07.494117 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="dnsmasq-dns" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494178 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="dnsmasq-dns" Dec 06 07:21:07 crc kubenswrapper[4954]: E1206 07:21:07.494226 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="init" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494234 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="init" Dec 06 07:21:07 crc kubenswrapper[4954]: E1206 07:21:07.494255 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7bda2f-6de7-4e30-be44-adc2196c3510" containerName="nova-manage" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494262 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7bda2f-6de7-4e30-be44-adc2196c3510" containerName="nova-manage" Dec 06 07:21:07 crc kubenswrapper[4954]: E1206 07:21:07.494277 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98f49b7-0577-4cc4-a3cc-bf5392fae7af" containerName="nova-cell1-conductor-db-sync" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494284 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98f49b7-0577-4cc4-a3cc-bf5392fae7af" containerName="nova-cell1-conductor-db-sync" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494536 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="674dbe14-b0c4-4854-9861-c374f07568d0" containerName="dnsmasq-dns" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494578 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7bda2f-6de7-4e30-be44-adc2196c3510" containerName="nova-manage" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.494592 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a98f49b7-0577-4cc4-a3cc-bf5392fae7af" containerName="nova-cell1-conductor-db-sync" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.495408 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.514951 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.561631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcb2k\" (UniqueName: \"kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.561879 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.561909 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.648658 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.659508 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.659807 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-log" containerID="cri-o://0a5f7776ac6932a67493a983e7d3486a87d250c43ecfbbfd162fa44fdd435793" gracePeriod=30 Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.660333 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-api" containerID="cri-o://d730da60e7e9b02874412058733684d323f5559903d10c98387eeac547cef214" gracePeriod=30 Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.663869 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcb2k\" (UniqueName: \"kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.663926 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.663957 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 
07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.694413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.695159 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.698944 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcb2k\" (UniqueName: \"kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k\") pod \"nova-cell1-conductor-0\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.766121 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") pod \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\" (UID: \"a98f49b7-0577-4cc4-a3cc-bf5392fae7af\") " Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.776861 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data" (OuterVolumeSpecName: "config-data") pod "a98f49b7-0577-4cc4-a3cc-bf5392fae7af" (UID: "a98f49b7-0577-4cc4-a3cc-bf5392fae7af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.835452 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:07 crc kubenswrapper[4954]: I1206 07:21:07.870585 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98f49b7-0577-4cc4-a3cc-bf5392fae7af-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:08 crc kubenswrapper[4954]: I1206 07:21:08.384864 4954 generic.go:334] "Generic (PLEG): container finished" podID="f2f5e966-55ea-4534-9731-c914136c224b" containerID="0a5f7776ac6932a67493a983e7d3486a87d250c43ecfbbfd162fa44fdd435793" exitCode=143 Dec 06 07:21:08 crc kubenswrapper[4954]: I1206 07:21:08.385644 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerName="nova-scheduler-scheduler" containerID="cri-o://d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f" gracePeriod=30 Dec 06 07:21:08 crc kubenswrapper[4954]: I1206 07:21:08.386056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerDied","Data":"0a5f7776ac6932a67493a983e7d3486a87d250c43ecfbbfd162fa44fdd435793"} Dec 06 07:21:08 crc kubenswrapper[4954]: I1206 07:21:08.430245 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 07:21:09 crc kubenswrapper[4954]: I1206 07:21:09.402342 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc34e079-7411-438c-aca8-b2c95854158e","Type":"ContainerStarted","Data":"31234f4535422fb1bf7bada8c66937069f24dc6cdacdb390f9005698b7491575"} Dec 06 07:21:09 crc kubenswrapper[4954]: I1206 07:21:09.403021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc34e079-7411-438c-aca8-b2c95854158e","Type":"ContainerStarted","Data":"762b3ad59c7c647a311242e16d7f8285386c8c853d84ff67da08816f97c82278"} Dec 06 07:21:09 crc kubenswrapper[4954]: I1206 07:21:09.403058 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:09 crc kubenswrapper[4954]: I1206 07:21:09.432233 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.432209587 podStartE2EDuration="2.432209587s" podCreationTimestamp="2025-12-06 07:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:09.4315887 +0000 UTC m=+1444.244948099" watchObservedRunningTime="2025-12-06 07:21:09.432209587 +0000 UTC m=+1444.245568976" Dec 06 07:21:09 crc kubenswrapper[4954]: E1206 07:21:09.956017 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 07:21:09 crc kubenswrapper[4954]: E1206 07:21:09.957869 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 07:21:09 crc 
Dec 06 07:21:09 crc kubenswrapper[4954]: E1206 07:21:09.959684 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 07:21:09 crc kubenswrapper[4954]: E1206 07:21:09.959734 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerName="nova-scheduler-scheduler"
Dec 06 07:21:10 crc kubenswrapper[4954]: I1206 07:21:10.102049 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:21:10 crc kubenswrapper[4954]: I1206 07:21:10.102149 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:21:11 crc kubenswrapper[4954]: I1206 07:21:11.246470 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.086317 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"]
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.089245 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.102155 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"]
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.170611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cs2n\" (UniqueName: \"kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.171052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.171272 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.216434 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.272782 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle\") pod \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.272983 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9r2c\" (UniqueName: \"kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c\") pod \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.273105 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data\") pod \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\" (UID: \"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.273490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cs2n\" (UniqueName: \"kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.273555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.273634 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.282887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.286449 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c" (OuterVolumeSpecName: "kube-api-access-f9r2c") pod "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" (UID: "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2"). InnerVolumeSpecName "kube-api-access-f9r2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.287406 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.299164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cs2n\" (UniqueName: \"kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n\") pod \"redhat-operators-rlp8c\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.328235 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" (UID: "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.350108 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data" (OuterVolumeSpecName: "config-data") pod "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" (UID: "70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.375402 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9r2c\" (UniqueName: \"kubernetes.io/projected/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-kube-api-access-f9r2c\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.375440 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.375454 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.465694 4954 generic.go:334] "Generic (PLEG): container finished" podID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f" exitCode=0
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.465799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2","Type":"ContainerDied","Data":"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"}
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.465853 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2","Type":"ContainerDied","Data":"addb3aac249e9e6ce83f45a2336a48c29ee8eeaf9d574b21f48d5f60a5584f55"}
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.465871 4954 scope.go:117] "RemoveContainer" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.466385 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.486074 4954 generic.go:334] "Generic (PLEG): container finished" podID="f2f5e966-55ea-4534-9731-c914136c224b" containerID="d730da60e7e9b02874412058733684d323f5559903d10c98387eeac547cef214" exitCode=0
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.486148 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerDied","Data":"d730da60e7e9b02874412058733684d323f5559903d10c98387eeac547cef214"}
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.516919 4954 scope.go:117] "RemoveContainer" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"
Dec 06 07:21:12 crc kubenswrapper[4954]: E1206 07:21:12.522748 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f\": container with ID starting with d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f not found: ID does not exist" containerID="d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.523031 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f"} err="failed to get container status \"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f\": rpc error: code = NotFound desc = could not find container \"d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f\": container with ID starting with d5bc3cbe768284dd9aec6bec08e9a2a8013ad4ae9bd475a320d930bd023ac86f not found: ID does not exist"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.532143 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.539628 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlp8c"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.548435 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.560363 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 07:21:12 crc kubenswrapper[4954]: E1206 07:21:12.561155 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerName="nova-scheduler-scheduler"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.561188 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerName="nova-scheduler-scheduler"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.561521 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" containerName="nova-scheduler-scheduler"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.562642 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.566241 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.579779 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.586832 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.681589 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle\") pod \"f2f5e966-55ea-4534-9731-c914136c224b\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs\") pod \"f2f5e966-55ea-4534-9731-c914136c224b\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682215 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data\") pod \"f2f5e966-55ea-4534-9731-c914136c224b\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682394 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xsbq\" (UniqueName: \"kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq\") pod \"f2f5e966-55ea-4534-9731-c914136c224b\" (UID: \"f2f5e966-55ea-4534-9731-c914136c224b\") "
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682727 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfc2\" (UniqueName: \"kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.682904 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.683117 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs" (OuterVolumeSpecName: "logs") pod "f2f5e966-55ea-4534-9731-c914136c224b" (UID: "f2f5e966-55ea-4534-9731-c914136c224b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.692024 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq" (OuterVolumeSpecName: "kube-api-access-9xsbq") pod "f2f5e966-55ea-4534-9731-c914136c224b" (UID: "f2f5e966-55ea-4534-9731-c914136c224b"). InnerVolumeSpecName "kube-api-access-9xsbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.741817 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2f5e966-55ea-4534-9731-c914136c224b" (UID: "f2f5e966-55ea-4534-9731-c914136c224b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.748812 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data" (OuterVolumeSpecName: "config-data") pod "f2f5e966-55ea-4534-9731-c914136c224b" (UID: "f2f5e966-55ea-4534-9731-c914136c224b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787165 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfc2\" (UniqueName: \"kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787425 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787438 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f5e966-55ea-4534-9731-c914136c224b-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787447 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f5e966-55ea-4534-9731-c914136c224b-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.787456 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xsbq\" (UniqueName: \"kubernetes.io/projected/f2f5e966-55ea-4534-9731-c914136c224b-kube-api-access-9xsbq\") on node \"crc\" DevicePath \"\""
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.792212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.794763 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.816228 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfc2\" (UniqueName: \"kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2\") pod \"nova-scheduler-0\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " pod="openstack/nova-scheduler-0"
Dec 06 07:21:12 crc kubenswrapper[4954]: I1206 07:21:12.892983 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.140832 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"]
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.458678 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2" path="/var/lib/kubelet/pods/70b1dcc4-a8ef-4211-b946-3e0bedcbe5c2/volumes"
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.500125 4954 generic.go:334] "Generic (PLEG): container finished" podID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerID="cfdf07915e103b77672f6abb91e43ab46cd191d43f002e4b0c5c7cdd130880af" exitCode=0
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.500254 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerDied","Data":"cfdf07915e103b77672f6abb91e43ab46cd191d43f002e4b0c5c7cdd130880af"}
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.500366 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerStarted","Data":"8ecc53852fd1b2470c21dfa24e49e5fb814320c6b3c90dda975394ab2044db7a"}
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.507333 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2f5e966-55ea-4534-9731-c914136c224b","Type":"ContainerDied","Data":"95cb1ecbb47ecc2268a8ea2aedc338f77fcc858b7d3b817cab69ee94bb7c5c19"}
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.507397 4954 scope.go:117] "RemoveContainer" containerID="d730da60e7e9b02874412058733684d323f5559903d10c98387eeac547cef214"
Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.507464 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.545903 4954 scope.go:117] "RemoveContainer" containerID="0a5f7776ac6932a67493a983e7d3486a87d250c43ecfbbfd162fa44fdd435793" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.629103 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.644675 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.657191 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.668293 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:13 crc kubenswrapper[4954]: E1206 07:21:13.669185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-log" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.669264 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-log" Dec 06 07:21:13 crc kubenswrapper[4954]: E1206 07:21:13.669395 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-api" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.669459 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-api" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.669733 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-log" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.669805 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f5e966-55ea-4534-9731-c914136c224b" containerName="nova-api-api" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.671423 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.678096 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.694278 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.722262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.722368 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.722404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.722450 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt8dw\" (UniqueName: \"kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.824697 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.824830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.824871 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.824923 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt8dw\" (UniqueName: \"kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.825782 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " 
pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.831017 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.831453 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:13 crc kubenswrapper[4954]: I1206 07:21:13.847779 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt8dw\" (UniqueName: \"kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw\") pod \"nova-api-0\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " pod="openstack/nova-api-0" Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.073175 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.518717 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerStarted","Data":"073e61f5d90665e18b8c3396d1a787b2657cc16236e7aacf7f53848c5eb3adf8"} Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.522419 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"54d05135-2771-460c-9949-4e16516015c4","Type":"ContainerStarted","Data":"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b"} Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.522469 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"54d05135-2771-460c-9949-4e16516015c4","Type":"ContainerStarted","Data":"7d7866f19950fb113dbb94472eda9135a59f8e762fca28aef0fa16631311a4ba"} Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.566611 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.56659172 podStartE2EDuration="2.56659172s" podCreationTimestamp="2025-12-06 07:21:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:14.557491735 +0000 UTC m=+1449.370851124" watchObservedRunningTime="2025-12-06 07:21:14.56659172 +0000 UTC m=+1449.379951109" Dec 06 07:21:14 crc kubenswrapper[4954]: W1206 07:21:14.652222 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fd0b6e9_6264_4abf_80a8_b4fbffbe9f3f.slice/crio-bac223ca4a24bd15c0802b0ad83239a652cf8ec1f37c88877645870e7e7c2a4d WatchSource:0}: Error finding container bac223ca4a24bd15c0802b0ad83239a652cf8ec1f37c88877645870e7e7c2a4d: Status 404 returned error can't find the container with id bac223ca4a24bd15c0802b0ad83239a652cf8ec1f37c88877645870e7e7c2a4d Dec 06 07:21:14 crc kubenswrapper[4954]: I1206 07:21:14.653277 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.459820 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f2f5e966-55ea-4534-9731-c914136c224b" path="/var/lib/kubelet/pods/f2f5e966-55ea-4534-9731-c914136c224b/volumes" Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.538472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerStarted","Data":"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242"} Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.538526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerStarted","Data":"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada"} Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.538541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerStarted","Data":"bac223ca4a24bd15c0802b0ad83239a652cf8ec1f37c88877645870e7e7c2a4d"} Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.541264 4954 generic.go:334] "Generic (PLEG): container finished" podID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerID="073e61f5d90665e18b8c3396d1a787b2657cc16236e7aacf7f53848c5eb3adf8" exitCode=0 Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.541292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerDied","Data":"073e61f5d90665e18b8c3396d1a787b2657cc16236e7aacf7f53848c5eb3adf8"} Dec 06 07:21:15 crc kubenswrapper[4954]: I1206 07:21:15.565633 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.565605172 podStartE2EDuration="2.565605172s" podCreationTimestamp="2025-12-06 07:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:15.556851257 +0000 UTC m=+1450.370210646" watchObservedRunningTime="2025-12-06 07:21:15.565605172 +0000 UTC m=+1450.378964571" Dec 06 07:21:17 crc kubenswrapper[4954]: I1206 07:21:17.564142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerStarted","Data":"8afbd1dae36253d138592f7b2ecd6d436f3b2d2cbaae7e1362a7c3a410bb0e16"} Dec 06 07:21:17 crc kubenswrapper[4954]: I1206 07:21:17.592702 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rlp8c" podStartSLOduration=2.512464682 podStartE2EDuration="5.592676329s" podCreationTimestamp="2025-12-06 07:21:12 +0000 UTC" firstStartedPulling="2025-12-06 07:21:13.502381571 +0000 UTC m=+1448.315740960" lastFinishedPulling="2025-12-06 07:21:16.582593218 +0000 UTC m=+1451.395952607" observedRunningTime="2025-12-06 07:21:17.584255232 +0000 UTC m=+1452.397614621" watchObservedRunningTime="2025-12-06 07:21:17.592676329 +0000 UTC m=+1452.406035718" Dec 06 07:21:17 crc kubenswrapper[4954]: I1206 07:21:17.888728 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 07:21:17 crc kubenswrapper[4954]: I1206 07:21:17.894183 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.540752 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.543245 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.598472 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.696326 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.856092 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"] Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.894741 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 07:21:22 crc kubenswrapper[4954]: I1206 07:21:22.933985 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 07:21:23 crc kubenswrapper[4954]: I1206 07:21:23.702601 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 07:21:24 crc kubenswrapper[4954]: I1206 07:21:24.073653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:21:24 crc kubenswrapper[4954]: I1206 07:21:24.073742 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:21:24 crc kubenswrapper[4954]: I1206 07:21:24.669063 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rlp8c" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="registry-server" containerID="cri-o://8afbd1dae36253d138592f7b2ecd6d436f3b2d2cbaae7e1362a7c3a410bb0e16" gracePeriod=2 Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.155986 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.156128 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.693646 4954 generic.go:334] "Generic (PLEG): container finished" podID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerID="8afbd1dae36253d138592f7b2ecd6d436f3b2d2cbaae7e1362a7c3a410bb0e16" exitCode=0 Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.693840 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerDied","Data":"8afbd1dae36253d138592f7b2ecd6d436f3b2d2cbaae7e1362a7c3a410bb0e16"} Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.694007 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlp8c" 
event={"ID":"e1260548-45c8-4fca-9f7b-52a5cd1e9d30","Type":"ContainerDied","Data":"8ecc53852fd1b2470c21dfa24e49e5fb814320c6b3c90dda975394ab2044db7a"} Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.694023 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecc53852fd1b2470c21dfa24e49e5fb814320c6b3c90dda975394ab2044db7a" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.738365 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.830518 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cs2n\" (UniqueName: \"kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n\") pod \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.830674 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities\") pod \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.830749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content\") pod \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\" (UID: \"e1260548-45c8-4fca-9f7b-52a5cd1e9d30\") " Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.839423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities" (OuterVolumeSpecName: "utilities") pod "e1260548-45c8-4fca-9f7b-52a5cd1e9d30" (UID: "e1260548-45c8-4fca-9f7b-52a5cd1e9d30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.848030 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n" (OuterVolumeSpecName: "kube-api-access-2cs2n") pod "e1260548-45c8-4fca-9f7b-52a5cd1e9d30" (UID: "e1260548-45c8-4fca-9f7b-52a5cd1e9d30"). InnerVolumeSpecName "kube-api-access-2cs2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.932528 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cs2n\" (UniqueName: \"kubernetes.io/projected/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-kube-api-access-2cs2n\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.932999 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:25 crc kubenswrapper[4954]: I1206 07:21:25.954470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1260548-45c8-4fca-9f7b-52a5cd1e9d30" (UID: "e1260548-45c8-4fca-9f7b-52a5cd1e9d30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:26 crc kubenswrapper[4954]: I1206 07:21:26.035060 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1260548-45c8-4fca-9f7b-52a5cd1e9d30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:26 crc kubenswrapper[4954]: I1206 07:21:26.720773 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlp8c" Dec 06 07:21:26 crc kubenswrapper[4954]: I1206 07:21:26.785005 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"] Dec 06 07:21:26 crc kubenswrapper[4954]: I1206 07:21:26.795468 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rlp8c"] Dec 06 07:21:27 crc kubenswrapper[4954]: I1206 07:21:27.461344 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" path="/var/lib/kubelet/pods/e1260548-45c8-4fca-9f7b-52a5cd1e9d30/volumes" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.629312 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.753654 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data\") pod \"81f000b6-60d1-4fd0-a294-8c1907b13d87\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.755133 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnqhv\" (UniqueName: \"kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv\") pod \"81f000b6-60d1-4fd0-a294-8c1907b13d87\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.755478 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle\") pod \"81f000b6-60d1-4fd0-a294-8c1907b13d87\" (UID: \"81f000b6-60d1-4fd0-a294-8c1907b13d87\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.755692 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.763149 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv" (OuterVolumeSpecName: "kube-api-access-nnqhv") pod "81f000b6-60d1-4fd0-a294-8c1907b13d87" (UID: "81f000b6-60d1-4fd0-a294-8c1907b13d87"). InnerVolumeSpecName "kube-api-access-nnqhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.774002 4954 generic.go:334] "Generic (PLEG): container finished" podID="81f000b6-60d1-4fd0-a294-8c1907b13d87" containerID="aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735" exitCode=137 Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.774046 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81f000b6-60d1-4fd0-a294-8c1907b13d87","Type":"ContainerDied","Data":"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735"} Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.774092 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81f000b6-60d1-4fd0-a294-8c1907b13d87","Type":"ContainerDied","Data":"cf932f376a808ca50efcf8dbe28bb3d1bf7c94f5c1edddc9c131bce31e61bc84"} Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.774113 4954 scope.go:117] "RemoveContainer" containerID="aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.774112 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.786787 4954 generic.go:334] "Generic (PLEG): container finished" podID="82a54673-0760-4196-adc8-728b1cf08d6f" containerID="0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8" exitCode=137 Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.786843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerDied","Data":"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8"} Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.787227 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82a54673-0760-4196-adc8-728b1cf08d6f","Type":"ContainerDied","Data":"d93c8d084ce9ca568e511e2a333f40ebe622bcc72897226697d22773d3fa6702"} Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.786881 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.797147 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data" (OuterVolumeSpecName: "config-data") pod "81f000b6-60d1-4fd0-a294-8c1907b13d87" (UID: "81f000b6-60d1-4fd0-a294-8c1907b13d87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.812581 4954 scope.go:117] "RemoveContainer" containerID="aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735" Dec 06 07:21:30 crc kubenswrapper[4954]: E1206 07:21:30.813074 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735\": container with ID starting with aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735 not found: ID does not exist" containerID="aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.813123 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735"} err="failed to get container status \"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735\": rpc error: code = NotFound desc = could not find container \"aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735\": container with ID starting with aa147bd0eccd1b05b4dbd4ed6ed1631d0683cb1082a218f357e0cbfc2f7eb735 not found: ID does not exist" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.813152 4954 scope.go:117] "RemoveContainer" containerID="0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.832474 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81f000b6-60d1-4fd0-a294-8c1907b13d87" (UID: "81f000b6-60d1-4fd0-a294-8c1907b13d87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.847400 4954 scope.go:117] "RemoveContainer" containerID="848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.857396 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data\") pod \"82a54673-0760-4196-adc8-728b1cf08d6f\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.857621 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7j85\" (UniqueName: \"kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85\") pod \"82a54673-0760-4196-adc8-728b1cf08d6f\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.857676 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs\") pod \"82a54673-0760-4196-adc8-728b1cf08d6f\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.857738 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle\") pod \"82a54673-0760-4196-adc8-728b1cf08d6f\" (UID: \"82a54673-0760-4196-adc8-728b1cf08d6f\") " Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.858113 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs" (OuterVolumeSpecName: "logs") pod "82a54673-0760-4196-adc8-728b1cf08d6f" (UID: "82a54673-0760-4196-adc8-728b1cf08d6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.858237 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.858256 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnqhv\" (UniqueName: \"kubernetes.io/projected/81f000b6-60d1-4fd0-a294-8c1907b13d87-kube-api-access-nnqhv\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.858270 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a54673-0760-4196-adc8-728b1cf08d6f-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.858281 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f000b6-60d1-4fd0-a294-8c1907b13d87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.861752 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85" (OuterVolumeSpecName: "kube-api-access-p7j85") pod "82a54673-0760-4196-adc8-728b1cf08d6f" (UID: "82a54673-0760-4196-adc8-728b1cf08d6f"). InnerVolumeSpecName "kube-api-access-p7j85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.874723 4954 scope.go:117] "RemoveContainer" containerID="0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8" Dec 06 07:21:30 crc kubenswrapper[4954]: E1206 07:21:30.875450 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8\": container with ID starting with 0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8 not found: ID does not exist" containerID="0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.875490 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8"} err="failed to get container status \"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8\": rpc error: code = NotFound desc = could not find container \"0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8\": container with ID starting with 0588f4ab5c61818a8cf69761dd41d59aa76fec625ff5df9de1e6eb75f04588a8 not found: ID does not exist" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.875515 4954 scope.go:117] "RemoveContainer" containerID="848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8" Dec 06 07:21:30 crc kubenswrapper[4954]: E1206 07:21:30.875887 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8\": container with ID starting with 848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8 not found: ID does not exist" containerID="848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.875926 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8"} err="failed to get container status \"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8\": rpc error: code = NotFound desc = could not find container \"848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8\": container with ID starting with 848822261d379f961b6c14c6ddcd86498583854ec7232167f65f9afaf2c468b8 not found: ID does not exist" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.890382 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82a54673-0760-4196-adc8-728b1cf08d6f" (UID: "82a54673-0760-4196-adc8-728b1cf08d6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.897311 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data" (OuterVolumeSpecName: "config-data") pod "82a54673-0760-4196-adc8-728b1cf08d6f" (UID: "82a54673-0760-4196-adc8-728b1cf08d6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.959645 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7j85\" (UniqueName: \"kubernetes.io/projected/82a54673-0760-4196-adc8-728b1cf08d6f-kube-api-access-p7j85\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.959692 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:30 crc kubenswrapper[4954]: I1206 07:21:30.959706 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a54673-0760-4196-adc8-728b1cf08d6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.153017 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.169031 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184113 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.184817 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-log" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184837 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-log" Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.184863 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-metadata" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184873 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-metadata" Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.184901 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f000b6-60d1-4fd0-a294-8c1907b13d87" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184910 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f000b6-60d1-4fd0-a294-8c1907b13d87" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.184925 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="extract-utilities" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184934 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="extract-utilities" Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.184965 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="extract-content" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.184973 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="extract-content" Dec 06 07:21:31 crc kubenswrapper[4954]: E1206 07:21:31.185005 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" 
containerName="registry-server" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.185013 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="registry-server" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.185627 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f000b6-60d1-4fd0-a294-8c1907b13d87" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.185711 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1260548-45c8-4fca-9f7b-52a5cd1e9d30" containerName="registry-server" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.185838 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-metadata" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.185853 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" containerName="nova-metadata-log" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.188212 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.194978 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.195163 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.195257 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.197814 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.215515 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.228423 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.236099 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.238529 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.241696 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.243637 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.245590 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267175 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52j85\" (UniqueName: \"kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267234 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267312 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267395 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267422 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbh7\" (UniqueName: \"kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267816 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.267849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369784 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52j85\" (UniqueName: \"kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369866 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.369970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbh7\" (UniqueName: \"kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.370007 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.370098 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.370122 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.371096 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.374386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.376065 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.376409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.376851 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.377335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.380134 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.381115 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.392929 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j85\" (UniqueName: \"kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85\") pod \"nova-metadata-0\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " pod="openstack/nova-metadata-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.395458 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbh7\" (UniqueName: \"kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.462697 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f000b6-60d1-4fd0-a294-8c1907b13d87" path="/var/lib/kubelet/pods/81f000b6-60d1-4fd0-a294-8c1907b13d87/volumes" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.463666 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a54673-0760-4196-adc8-728b1cf08d6f" path="/var/lib/kubelet/pods/82a54673-0760-4196-adc8-728b1cf08d6f/volumes" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.517645 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:31 crc kubenswrapper[4954]: I1206 07:21:31.566231 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.022884 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:21:32 crc kubenswrapper[4954]: W1206 07:21:32.029878 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e602bfa_4a57_4fe6_adca_90acb13b7458.slice/crio-184f44b211091fd1ce32e67162b351906ae92ae908086377ab1a1fba8b4da095 WatchSource:0}: Error finding container 184f44b211091fd1ce32e67162b351906ae92ae908086377ab1a1fba8b4da095: Status 404 returned error can't find the container with id 184f44b211091fd1ce32e67162b351906ae92ae908086377ab1a1fba8b4da095 Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.152738 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:32 crc kubenswrapper[4954]: W1206 07:21:32.158094 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeee9fe3e_4652_4a6e_aaaf_a28e6738e9dd.slice/crio-c6691f557efc50f71003c5353a0d0b0eb51a139e841e13ae3ec303be659df647 WatchSource:0}: Error finding container c6691f557efc50f71003c5353a0d0b0eb51a139e841e13ae3ec303be659df647: Status 404 returned error can't find the container with id c6691f557efc50f71003c5353a0d0b0eb51a139e841e13ae3ec303be659df647 Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.814152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e602bfa-4a57-4fe6-adca-90acb13b7458","Type":"ContainerStarted","Data":"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"} Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.814617 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e602bfa-4a57-4fe6-adca-90acb13b7458","Type":"ContainerStarted","Data":"184f44b211091fd1ce32e67162b351906ae92ae908086377ab1a1fba8b4da095"} Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.817502 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerStarted","Data":"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7"} Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.817529 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerStarted","Data":"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461"} Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.817540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerStarted","Data":"c6691f557efc50f71003c5353a0d0b0eb51a139e841e13ae3ec303be659df647"} Dec 06 07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.857648 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.857618788 podStartE2EDuration="1.857618788s" podCreationTimestamp="2025-12-06 07:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:32.849419978 +0000 UTC m=+1467.662779377" watchObservedRunningTime="2025-12-06 07:21:32.857618788 +0000 UTC m=+1467.670978177" Dec 06 
07:21:32 crc kubenswrapper[4954]: I1206 07:21:32.886645 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8866161689999998 podStartE2EDuration="1.886616169s" podCreationTimestamp="2025-12-06 07:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:32.872399717 +0000 UTC m=+1467.685759106" watchObservedRunningTime="2025-12-06 07:21:32.886616169 +0000 UTC m=+1467.699975588" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.078295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.078900 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.079304 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.079377 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.084698 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.085344 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.321382 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.324139 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.334983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.335039 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.335072 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.335116 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kkq\" (UniqueName: \"kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.335174 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.335232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.339533 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437030 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kkq\" (UniqueName: \"kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437140 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437219 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437279 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.437309 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.438402 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.438910 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.439233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.442156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.442398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.463498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kkq\" (UniqueName: 
\"kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq\") pod \"dnsmasq-dns-7f9fbbf6f7-6bn5n\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:34 crc kubenswrapper[4954]: I1206 07:21:34.658013 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:35 crc kubenswrapper[4954]: I1206 07:21:35.161992 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:21:35 crc kubenswrapper[4954]: I1206 07:21:35.859758 4954 generic.go:334] "Generic (PLEG): container finished" podID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerID="837f7307011dcdff84ddf290fbfe19555e90f8ff13ebde2aef98e541e550a775" exitCode=0 Dec 06 07:21:35 crc kubenswrapper[4954]: I1206 07:21:35.859818 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" event={"ID":"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f","Type":"ContainerDied","Data":"837f7307011dcdff84ddf290fbfe19555e90f8ff13ebde2aef98e541e550a775"} Dec 06 07:21:35 crc kubenswrapper[4954]: I1206 07:21:35.860163 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" event={"ID":"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f","Type":"ContainerStarted","Data":"20f0554ceee87a95a1da8cbc3375427e393f1dcd7c992d70c597d1754d942d58"} Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.304848 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.305772 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-central-agent" containerID="cri-o://f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" gracePeriod=30 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.305856 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="sg-core" containerID="cri-o://40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" gracePeriod=30 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.305902 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-notification-agent" containerID="cri-o://0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" gracePeriod=30 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.305903 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="proxy-httpd" containerID="cri-o://bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" gracePeriod=30 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.518146 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.566471 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.566546 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 
07:21:36.892985 4954 generic.go:334] "Generic (PLEG): container finished" podID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerID="bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" exitCode=0 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.893042 4954 generic.go:334] "Generic (PLEG): container finished" podID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerID="40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" exitCode=2 Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.893165 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerDied","Data":"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d"} Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.893206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerDied","Data":"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d"} Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.915180 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" event={"ID":"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f","Type":"ContainerStarted","Data":"9d7871a1cc043780cd4a4b6b1e5824bece7fc1fc553c7805c5dd188196365a91"} Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.916715 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:36 crc kubenswrapper[4954]: I1206 07:21:36.991125 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" podStartSLOduration=2.991100649 podStartE2EDuration="2.991100649s" podCreationTimestamp="2025-12-06 07:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:36.987002399 +0000 UTC m=+1471.800361788" watchObservedRunningTime="2025-12-06 07:21:36.991100649 +0000 UTC m=+1471.804460038" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.779880 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924097 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924618 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924670 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924710 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924758 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z4k7\" (UniqueName: \"kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924830 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924898 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.924953 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml\") pod \"b828b931-cffa-4170-b444-8d4c96dac8d4\" (UID: \"b828b931-cffa-4170-b444-8d4c96dac8d4\") " Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.926089 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.926188 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.936883 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts" (OuterVolumeSpecName: "scripts") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.938858 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7" (OuterVolumeSpecName: "kube-api-access-2z4k7") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "kube-api-access-2z4k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.952801 4954 generic.go:334] "Generic (PLEG): container finished" podID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerID="0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" exitCode=0 Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.953085 4954 generic.go:334] "Generic (PLEG): container finished" podID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerID="f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" exitCode=0 Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.953132 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.953023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerDied","Data":"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6"} Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.954698 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerDied","Data":"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197"} Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.954719 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b828b931-cffa-4170-b444-8d4c96dac8d4","Type":"ContainerDied","Data":"a706dd2dab9c2de4356937699ef4e577a5e4a7f40414e1938584ed32379d77ed"} Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.954737 4954 scope.go:117] "RemoveContainer" containerID="bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" Dec 06 07:21:37 crc kubenswrapper[4954]: I1206 07:21:37.976956 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.026720 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.027277 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.027403 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.027487 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b828b931-cffa-4170-b444-8d4c96dac8d4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.027589 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z4k7\" (UniqueName: \"kubernetes.io/projected/b828b931-cffa-4170-b444-8d4c96dac8d4-kube-api-access-2z4k7\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.044718 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.086755 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data" (OuterVolumeSpecName: "config-data") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.095904 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b828b931-cffa-4170-b444-8d4c96dac8d4" (UID: "b828b931-cffa-4170-b444-8d4c96dac8d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.130204 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.130249 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.130261 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b828b931-cffa-4170-b444-8d4c96dac8d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.165202 4954 scope.go:117] "RemoveContainer" containerID="40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.187272 4954 scope.go:117] "RemoveContainer" containerID="0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.209642 4954 scope.go:117] "RemoveContainer" containerID="f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.230582 4954 scope.go:117] "RemoveContainer" containerID="bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.231040 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d\": container with ID starting with bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d not found: ID does not exist" containerID="bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.231093 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d"} err="failed to get container status \"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d\": rpc error: code = NotFound desc = could not find container \"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d\": container with ID starting with bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.231125 4954 scope.go:117] "RemoveContainer" containerID="40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.231613 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d\": container with ID starting with 40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d not found: ID does not exist" containerID="40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.231741 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d"} err="failed to get container status 
\"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d\": rpc error: code = NotFound desc = could not find container \"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d\": container with ID starting with 40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.231804 4954 scope.go:117] "RemoveContainer" containerID="0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.232214 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6\": container with ID starting with 0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6 not found: ID does not exist" containerID="0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232258 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6"} err="failed to get container status \"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6\": rpc error: code = NotFound desc = could not find container \"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6\": container with ID starting with 0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6 not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232286 4954 scope.go:117] "RemoveContainer" containerID="f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.232583 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197\": container with ID starting with f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197 not found: ID does not exist" containerID="f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232608 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197"} err="failed to get container status \"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197\": rpc error: code = NotFound desc = could not find container \"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197\": container with ID starting with f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197 not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232625 4954 scope.go:117] "RemoveContainer" containerID="bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232810 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d"} err="failed to get container status \"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d\": rpc error: code = NotFound desc = could not find container \"bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d\": container with ID starting with bbc2db009796d1f06b4d5d7bcc1e07bd6717dbab8a4dd4380c0383aa7165333d not found: 
ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.232833 4954 scope.go:117] "RemoveContainer" containerID="40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.233022 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d"} err="failed to get container status \"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d\": rpc error: code = NotFound desc = could not find container \"40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d\": container with ID starting with 40c57cbfad521f169477bd33c369595d9e302bfde4d29f358049345954358c9d not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.233044 4954 scope.go:117] "RemoveContainer" containerID="0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.233368 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6"} err="failed to get container status \"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6\": rpc error: code = NotFound desc = could not find container \"0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6\": container with ID starting with 0a07eb818f81db3b59667d91c14dcbcdcf6908e6e66ca2faa69fcc8677bb85d6 not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.233432 4954 scope.go:117] "RemoveContainer" containerID="f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.233770 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197"} err="failed to get container status \"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197\": rpc error: code = NotFound desc = could not find container \"f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197\": container with ID starting with f40ef4eda20929d5d12f39fc0c1eb75066eaf4f8b13661257f721beffd512197 not found: ID does not exist" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.298491 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.318910 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.337748 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.338372 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="proxy-httpd" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338393 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="proxy-httpd" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.338405 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="sg-core" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338412 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="sg-core" Dec 06 
07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.338425 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-notification-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338432 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-notification-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: E1206 07:21:38.338446 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-central-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338452 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-central-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338655 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-notification-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338690 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="proxy-httpd" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338700 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="sg-core" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.338711 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" containerName="ceilometer-central-agent" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.349603 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.349735 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.354166 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.354318 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.354585 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436246 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtl6m\" (UniqueName: \"kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436339 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436493 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436652 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436779 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.436821 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 
06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.538254 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.538299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.538361 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539008 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.538429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539117 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539625 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtl6m\" (UniqueName: \"kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539653 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.539990 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc 
kubenswrapper[4954]: I1206 07:21:38.544461 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.544773 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.546248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.549457 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.550134 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.558410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtl6m\" (UniqueName: \"kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m\") pod \"ceilometer-0\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.667474 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.747755 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.748043 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-log" containerID="cri-o://210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada" gracePeriod=30 Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.748138 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-api" containerID="cri-o://f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242" gracePeriod=30 Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.981817 4954 generic.go:334] "Generic (PLEG): container finished" podID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerID="210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada" exitCode=143 Dec 06 07:21:38 crc kubenswrapper[4954]: I1206 07:21:38.982236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerDied","Data":"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada"} Dec 06 07:21:39 crc kubenswrapper[4954]: W1206 07:21:39.198413 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a9972ee_3ddb_4cf7_b715_1ef79a4ddf33.slice/crio-c50c0be506841b6ee69088c47962de1542a25496a4c3af68310c41deea98cfd7 WatchSource:0}: Error finding container c50c0be506841b6ee69088c47962de1542a25496a4c3af68310c41deea98cfd7: Status 404 returned error can't find the container with id c50c0be506841b6ee69088c47962de1542a25496a4c3af68310c41deea98cfd7 Dec 06 07:21:39 crc kubenswrapper[4954]: I1206 07:21:39.208672 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:39 crc kubenswrapper[4954]: I1206 07:21:39.456797 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b828b931-cffa-4170-b444-8d4c96dac8d4" path="/var/lib/kubelet/pods/b828b931-cffa-4170-b444-8d4c96dac8d4/volumes" Dec 06 07:21:40 crc kubenswrapper[4954]: I1206 07:21:40.009711 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerStarted","Data":"0df26832be8ed0e4cfddd4cb370f627c62b614a5308c62ad3c8362aef6afc836"} Dec 06 07:21:40 crc kubenswrapper[4954]: I1206 07:21:40.010072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerStarted","Data":"c50c0be506841b6ee69088c47962de1542a25496a4c3af68310c41deea98cfd7"} Dec 06 07:21:40 crc kubenswrapper[4954]: I1206 07:21:40.101678 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:21:40 crc kubenswrapper[4954]: I1206 07:21:40.102074 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:21:40 crc kubenswrapper[4954]: I1206 07:21:40.903150 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:41 crc kubenswrapper[4954]: I1206 07:21:41.026728 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerStarted","Data":"69a230e1a79ca9db23d9251fc789f2df890892130a17635354b52932100cf78d"} Dec 06 07:21:41 crc kubenswrapper[4954]: I1206 07:21:41.518329 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:41 crc kubenswrapper[4954]: I1206 07:21:41.540787 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:41 crc kubenswrapper[4954]: I1206 07:21:41.566764 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 07:21:41 crc kubenswrapper[4954]: I1206 07:21:41.566828 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.037365 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerStarted","Data":"fd43c1404183a391e32310915bf39a01d4fddc42d64c7049b23d5c3d2faaf8dd"} Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.059592 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.256615 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dw292"] Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.258076 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.261626 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.261756 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.279618 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dw292"] Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.418240 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkgd\" (UniqueName: \"kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.420991 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.421072 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.421148 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.529602 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.529685 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.529737 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.529797 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkgd\" (UniqueName: 
\"kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.539263 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.539587 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.539885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.561291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkgd\" (UniqueName: \"kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd\") pod \"nova-cell1-cell-mapping-dw292\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.586001 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.586106 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.627380 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.677783 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.835468 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data\") pod \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.835951 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs\") pod \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.836060 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt8dw\" (UniqueName: \"kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw\") pod \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.836156 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle\") pod \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\" (UID: \"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f\") " Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.837528 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs" (OuterVolumeSpecName: "logs") pod "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" (UID: "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.844887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw" (OuterVolumeSpecName: "kube-api-access-qt8dw") pod "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" (UID: "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f"). InnerVolumeSpecName "kube-api-access-qt8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.883766 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" (UID: "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.885962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data" (OuterVolumeSpecName: "config-data") pod "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" (UID: "5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.939086 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.939125 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.939135 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:42 crc kubenswrapper[4954]: I1206 07:21:42.939147 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt8dw\" (UniqueName: \"kubernetes.io/projected/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f-kube-api-access-qt8dw\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.060772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerStarted","Data":"8ceb960ad9f38302f005316f0ed08fe20612b50b07ad13d3d3e09ec242b6bb2c"} Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.061208 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.061271 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-central-agent" containerID="cri-o://0df26832be8ed0e4cfddd4cb370f627c62b614a5308c62ad3c8362aef6afc836" gracePeriod=30 Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.061824 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-notification-agent" containerID="cri-o://69a230e1a79ca9db23d9251fc789f2df890892130a17635354b52932100cf78d" gracePeriod=30 Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.061715 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="sg-core" containerID="cri-o://fd43c1404183a391e32310915bf39a01d4fddc42d64c7049b23d5c3d2faaf8dd" gracePeriod=30 Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.062483 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="proxy-httpd" containerID="cri-o://8ceb960ad9f38302f005316f0ed08fe20612b50b07ad13d3d3e09ec242b6bb2c" gracePeriod=30 Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.073101 4954 generic.go:334] "Generic (PLEG): container finished" podID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerID="f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242" exitCode=0 Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.074099 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.079056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerDied","Data":"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242"} Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.079150 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f","Type":"ContainerDied","Data":"bac223ca4a24bd15c0802b0ad83239a652cf8ec1f37c88877645870e7e7c2a4d"} Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.079179 4954 scope.go:117] "RemoveContainer" containerID="f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.114105 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9547381069999998 podStartE2EDuration="5.114072005s" podCreationTimestamp="2025-12-06 07:21:38 +0000 UTC" firstStartedPulling="2025-12-06 07:21:39.201151813 +0000 UTC m=+1474.014511212" lastFinishedPulling="2025-12-06 07:21:42.360485721 +0000 UTC m=+1477.173845110" observedRunningTime="2025-12-06 07:21:43.087403536 +0000 UTC m=+1477.900762935" watchObservedRunningTime="2025-12-06 07:21:43.114072005 +0000 UTC m=+1477.927431414" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.150425 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.155870 4954 scope.go:117] "RemoveContainer" containerID="210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.169038 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.191939 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:43 crc kubenswrapper[4954]: E1206 07:21:43.197205 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-api" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.197233 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-api" Dec 06 07:21:43 crc kubenswrapper[4954]: E1206 07:21:43.197282 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-log" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.197289 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-log" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.197550 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-api" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.197577 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" containerName="nova-api-log" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.198764 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.206844 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.207124 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.207300 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.208402 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.219441 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dw292"] Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.220825 4954 scope.go:117] "RemoveContainer" containerID="f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242" Dec 06 07:21:43 crc kubenswrapper[4954]: E1206 07:21:43.222677 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242\": container with ID starting with f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242 not found: ID does not exist" containerID="f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.222717 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242"} err="failed to get container status \"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242\": rpc error: code = NotFound desc = could not find container \"f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242\": container with ID starting with f379258a258b9574fba5dfd90fa5636ffb019e3060fedd9f49e7bb781e4d2242 not found: ID does not exist" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.222746 4954 scope.go:117] "RemoveContainer" containerID="210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada" Dec 06 07:21:43 crc kubenswrapper[4954]: E1206 07:21:43.231475 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada\": container with ID starting with 210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada not found: ID does not exist" containerID="210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.234134 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada"} err="failed to get container status \"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada\": rpc error: code = NotFound desc = could not find container \"210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada\": container with ID starting with 210068a35ed37a138c9935d71a7ec75ae8598294f57340d220b906ed630a4ada not found: ID does not exist" Dec 06 07:21:43 crc kubenswrapper[4954]: W1206 07:21:43.235666 4954 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8cf20f_f0a2_417a_914a_7710fa0fafb9.slice/crio-248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f WatchSource:0}: Error finding container 248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f: Status 404 returned error can't find the container with id 248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.352650 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.353130 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.353162 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nktr\" (UniqueName: \"kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.353188 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.353219 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.353266 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.455451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nktr\" (UniqueName: \"kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.455519 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.455593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.455706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.456187 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.456540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.456620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.456676 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f" path="/var/lib/kubelet/pods/5fd0b6e9-6264-4abf-80a8-b4fbffbe9f3f/volumes" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.461399 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.465860 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.467408 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.469396 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.483260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nktr\" (UniqueName: \"kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr\") pod \"nova-api-0\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " pod="openstack/nova-api-0" 
Dec 06 07:21:43 crc kubenswrapper[4954]: I1206 07:21:43.549831 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.051435 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:44 crc kubenswrapper[4954]: W1206 07:21:44.054936 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e51130d_df1d_4b88_96b4_d7fdaf627585.slice/crio-3a79582f615b4bb1056cf00b9bada647f848bba9d3af24671b4eb4cd19297980 WatchSource:0}: Error finding container 3a79582f615b4bb1056cf00b9bada647f848bba9d3af24671b4eb4cd19297980: Status 404 returned error can't find the container with id 3a79582f615b4bb1056cf00b9bada647f848bba9d3af24671b4eb4cd19297980 Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.094367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dw292" event={"ID":"dc8cf20f-f0a2-417a-914a-7710fa0fafb9","Type":"ContainerStarted","Data":"cadf0bf970efee2a4815c1a1b175fffd83284eaf0f45d7a45fe5892947c1264b"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.094441 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dw292" event={"ID":"dc8cf20f-f0a2-417a-914a-7710fa0fafb9","Type":"ContainerStarted","Data":"248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098832 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerID="8ceb960ad9f38302f005316f0ed08fe20612b50b07ad13d3d3e09ec242b6bb2c" exitCode=0 Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098865 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerID="fd43c1404183a391e32310915bf39a01d4fddc42d64c7049b23d5c3d2faaf8dd" exitCode=2 Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098880 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerID="69a230e1a79ca9db23d9251fc789f2df890892130a17635354b52932100cf78d" exitCode=0 Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098860 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerDied","Data":"8ceb960ad9f38302f005316f0ed08fe20612b50b07ad13d3d3e09ec242b6bb2c"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098936 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerDied","Data":"fd43c1404183a391e32310915bf39a01d4fddc42d64c7049b23d5c3d2faaf8dd"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.098955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerDied","Data":"69a230e1a79ca9db23d9251fc789f2df890892130a17635354b52932100cf78d"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.101797 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerStarted","Data":"3a79582f615b4bb1056cf00b9bada647f848bba9d3af24671b4eb4cd19297980"} Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.114807 4954 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/nova-cell1-cell-mapping-dw292" podStartSLOduration=2.114788053 podStartE2EDuration="2.114788053s" podCreationTimestamp="2025-12-06 07:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:44.113294483 +0000 UTC m=+1478.926653892" watchObservedRunningTime="2025-12-06 07:21:44.114788053 +0000 UTC m=+1478.928147442" Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.659743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.751418 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:21:44 crc kubenswrapper[4954]: I1206 07:21:44.751917 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="dnsmasq-dns" containerID="cri-o://487f3aa5673d5d19a4b982a861f9ad3bd692ff743abe78c916a7c6664e7db129" gracePeriod=10 Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.116632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerStarted","Data":"0d75a0fbb0dc278846fb50c1bc0b4b577c675599c9f610b4ac5a6dc908b9d049"} Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.116974 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerStarted","Data":"397c197bb4916cc3fdaeece9fa61fc8cbfc77895ee2b87f8e8864bb2222c4711"} Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.125955 4954 generic.go:334] "Generic (PLEG): container finished" podID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerID="487f3aa5673d5d19a4b982a861f9ad3bd692ff743abe78c916a7c6664e7db129" exitCode=0 Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.126108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" event={"ID":"cf8ab637-2f55-4fe9-ba71-e2f98657d262","Type":"ContainerDied","Data":"487f3aa5673d5d19a4b982a861f9ad3bd692ff743abe78c916a7c6664e7db129"} Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.143493 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.143469664 podStartE2EDuration="2.143469664s" podCreationTimestamp="2025-12-06 07:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:45.139072706 +0000 UTC m=+1479.952432105" watchObservedRunningTime="2025-12-06 07:21:45.143469664 +0000 UTC m=+1479.956829053" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.217993 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403391 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403459 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403545 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403661 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403691 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.403904 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmp5b\" (UniqueName: \"kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b\") pod \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\" (UID: \"cf8ab637-2f55-4fe9-ba71-e2f98657d262\") " Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.415975 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b" (OuterVolumeSpecName: "kube-api-access-hmp5b") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "kube-api-access-hmp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.464302 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config" (OuterVolumeSpecName: "config") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.464827 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.484048 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.485401 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.504538 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf8ab637-2f55-4fe9-ba71-e2f98657d262" (UID: "cf8ab637-2f55-4fe9-ba71-e2f98657d262"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506026 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506065 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506078 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmp5b\" (UniqueName: \"kubernetes.io/projected/cf8ab637-2f55-4fe9-ba71-e2f98657d262-kube-api-access-hmp5b\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506089 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506099 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:45 crc kubenswrapper[4954]: I1206 07:21:45.506107 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8ab637-2f55-4fe9-ba71-e2f98657d262-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.141502 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.142693 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" event={"ID":"cf8ab637-2f55-4fe9-ba71-e2f98657d262","Type":"ContainerDied","Data":"7921291b24459d67932e56a85bfc2e426431f8af5f83af89515bce9aa524ec30"} Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.142778 4954 scope.go:117] "RemoveContainer" containerID="487f3aa5673d5d19a4b982a861f9ad3bd692ff743abe78c916a7c6664e7db129" Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.185723 4954 scope.go:117] "RemoveContainer" containerID="49cad8d92feec35081d61fb74310c81ffdb65596e75afa4f556d27a5b745dfa5" Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.190374 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:21:46 crc kubenswrapper[4954]: I1206 07:21:46.203019 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-bh22g"] Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.186290 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerID="0df26832be8ed0e4cfddd4cb370f627c62b614a5308c62ad3c8362aef6afc836" exitCode=0 Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.186382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerDied","Data":"0df26832be8ed0e4cfddd4cb370f627c62b614a5308c62ad3c8362aef6afc836"} Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.352800 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.448716 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.448867 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.448920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.449041 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtl6m\" (UniqueName: \"kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.449158 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: 
\"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.449189 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.449218 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.449341 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.451962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.452361 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.458780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts" (OuterVolumeSpecName: "scripts") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.465833 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" path="/var/lib/kubelet/pods/cf8ab637-2f55-4fe9-ba71-e2f98657d262/volumes" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.468664 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m" (OuterVolumeSpecName: "kube-api-access-jtl6m") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "kube-api-access-jtl6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.518405 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.519668 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.550456 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.551246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") pod \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\" (UID: \"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33\") " Dec 06 07:21:47 crc kubenswrapper[4954]: W1206 07:21:47.551467 4954 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33/volumes/kubernetes.io~secret/combined-ca-bundle Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.551518 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552066 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtl6m\" (UniqueName: \"kubernetes.io/projected/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-kube-api-access-jtl6m\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552098 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552111 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552130 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552140 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552149 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.552445 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.577211 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data" (OuterVolumeSpecName: "config-data") pod "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" (UID: "4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:47 crc kubenswrapper[4954]: I1206 07:21:47.660435 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.206368 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33","Type":"ContainerDied","Data":"c50c0be506841b6ee69088c47962de1542a25496a4c3af68310c41deea98cfd7"} Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.206427 4954 scope.go:117] "RemoveContainer" containerID="8ceb960ad9f38302f005316f0ed08fe20612b50b07ad13d3d3e09ec242b6bb2c" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.206607 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.268390 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.270645 4954 scope.go:117] "RemoveContainer" containerID="fd43c1404183a391e32310915bf39a01d4fddc42d64c7049b23d5c3d2faaf8dd" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.297978 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.309230 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314359 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="proxy-httpd" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314395 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="proxy-httpd" Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314441 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-notification-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314451 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-notification-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314465 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="sg-core" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314473 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="sg-core" Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314499 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="dnsmasq-dns" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314509 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="dnsmasq-dns" Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314580 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="init" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314592 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="init" Dec 06 07:21:48 crc kubenswrapper[4954]: E1206 07:21:48.314617 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-central-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314625 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-central-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314934 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-central-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314948 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="sg-core" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314973 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="ceilometer-notification-agent" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.314996 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="dnsmasq-dns" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.315013 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" containerName="proxy-httpd" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.319398 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.326318 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.326769 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.326845 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.350910 4954 scope.go:117] "RemoveContainer" containerID="69a230e1a79ca9db23d9251fc789f2df890892130a17635354b52932100cf78d" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.351162 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.376670 4954 scope.go:117] "RemoveContainer" containerID="0df26832be8ed0e4cfddd4cb370f627c62b614a5308c62ad3c8362aef6afc836" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.476946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477095 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477140 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " 
pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gp7f\" (UniqueName: \"kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477270 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.477292 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579603 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579819 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gp7f\" (UniqueName: \"kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579875 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.579964 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts\") pod 
\"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.580059 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.580317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.580647 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.588408 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.591890 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.591906 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.592532 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.600473 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gp7f\" (UniqueName: \"kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.611655 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts\") pod \"ceilometer-0\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " pod="openstack/ceilometer-0" Dec 06 07:21:48 crc kubenswrapper[4954]: I1206 07:21:48.656316 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:21:49 crc kubenswrapper[4954]: W1206 07:21:49.138177 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f6b0598_f49f_4300_a2e3_edb512001517.slice/crio-6e7d8d3bc70058aa8b89c559d1a61bc72fc98f0687b379dea6d56813fad2887a WatchSource:0}: Error finding container 6e7d8d3bc70058aa8b89c559d1a61bc72fc98f0687b379dea6d56813fad2887a: Status 404 returned error can't find the container with id 6e7d8d3bc70058aa8b89c559d1a61bc72fc98f0687b379dea6d56813fad2887a Dec 06 07:21:49 crc kubenswrapper[4954]: I1206 07:21:49.140290 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:21:49 crc kubenswrapper[4954]: I1206 07:21:49.221061 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerStarted","Data":"6e7d8d3bc70058aa8b89c559d1a61bc72fc98f0687b379dea6d56813fad2887a"} Dec 06 07:21:49 crc kubenswrapper[4954]: I1206 07:21:49.467957 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33" path="/var/lib/kubelet/pods/4a9972ee-3ddb-4cf7-b715-1ef79a4ddf33/volumes" Dec 06 07:21:50 crc kubenswrapper[4954]: I1206 07:21:50.126724 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd87576bf-bh22g" podUID="cf8ab637-2f55-4fe9-ba71-e2f98657d262" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.189:5353: i/o timeout" Dec 06 07:21:50 crc kubenswrapper[4954]: I1206 07:21:50.234302 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerStarted","Data":"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2"} Dec 06 07:21:50 crc kubenswrapper[4954]: I1206 07:21:50.235961 4954 generic.go:334] "Generic (PLEG): container finished" podID="dc8cf20f-f0a2-417a-914a-7710fa0fafb9" containerID="cadf0bf970efee2a4815c1a1b175fffd83284eaf0f45d7a45fe5892947c1264b" exitCode=0 Dec 06 07:21:50 crc kubenswrapper[4954]: I1206 07:21:50.236039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dw292" event={"ID":"dc8cf20f-f0a2-417a-914a-7710fa0fafb9","Type":"ContainerDied","Data":"cadf0bf970efee2a4815c1a1b175fffd83284eaf0f45d7a45fe5892947c1264b"} Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.251363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerStarted","Data":"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5"} Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.252126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerStarted","Data":"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812"} Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.582025 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.582273 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.590260 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.594078 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.713193 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.874158 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts\") pod \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.874224 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data\") pod \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.874272 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle\") pod \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.874295 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgkgd\" (UniqueName: \"kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd\") pod \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\" (UID: \"dc8cf20f-f0a2-417a-914a-7710fa0fafb9\") " Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.892205 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd" (OuterVolumeSpecName: "kube-api-access-mgkgd") pod "dc8cf20f-f0a2-417a-914a-7710fa0fafb9" (UID: "dc8cf20f-f0a2-417a-914a-7710fa0fafb9"). InnerVolumeSpecName "kube-api-access-mgkgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.892836 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts" (OuterVolumeSpecName: "scripts") pod "dc8cf20f-f0a2-417a-914a-7710fa0fafb9" (UID: "dc8cf20f-f0a2-417a-914a-7710fa0fafb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.927298 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc8cf20f-f0a2-417a-914a-7710fa0fafb9" (UID: "dc8cf20f-f0a2-417a-914a-7710fa0fafb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.927365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data" (OuterVolumeSpecName: "config-data") pod "dc8cf20f-f0a2-417a-914a-7710fa0fafb9" (UID: "dc8cf20f-f0a2-417a-914a-7710fa0fafb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.976844 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.977927 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.978073 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:51 crc kubenswrapper[4954]: I1206 07:21:51.978172 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgkgd\" (UniqueName: \"kubernetes.io/projected/dc8cf20f-f0a2-417a-914a-7710fa0fafb9-kube-api-access-mgkgd\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.262874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dw292" event={"ID":"dc8cf20f-f0a2-417a-914a-7710fa0fafb9","Type":"ContainerDied","Data":"248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f"} Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.262935 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="248782f5ca8973af9cb2a6a85430e61287373ebea4b83076424ecaf2a974848f" Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.262944 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dw292" Dec 06 07:21:52 crc kubenswrapper[4954]: E1206 07:21:52.400181 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8cf20f_f0a2_417a_914a_7710fa0fafb9.slice\": RecentStats: unable to find data in memory cache]" Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.937688 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.938428 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="54d05135-2771-460c-9949-4e16516015c4" containerName="nova-scheduler-scheduler" containerID="cri-o://0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" gracePeriod=30 Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.961642 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.962342 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-log" containerID="cri-o://397c197bb4916cc3fdaeece9fa61fc8cbfc77895ee2b87f8e8864bb2222c4711" gracePeriod=30 Dec 06 07:21:52 crc kubenswrapper[4954]: I1206 07:21:52.963047 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-api" containerID="cri-o://0d75a0fbb0dc278846fb50c1bc0b4b577c675599c9f610b4ac5a6dc908b9d049" gracePeriod=30 Dec 06 07:21:52 crc 
kubenswrapper[4954]: I1206 07:21:52.987787 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.285412 4954 generic.go:334] "Generic (PLEG): container finished" podID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerID="0d75a0fbb0dc278846fb50c1bc0b4b577c675599c9f610b4ac5a6dc908b9d049" exitCode=0 Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.301011 4954 generic.go:334] "Generic (PLEG): container finished" podID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerID="397c197bb4916cc3fdaeece9fa61fc8cbfc77895ee2b87f8e8864bb2222c4711" exitCode=143 Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.286252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerDied","Data":"0d75a0fbb0dc278846fb50c1bc0b4b577c675599c9f610b4ac5a6dc908b9d049"} Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.301221 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerDied","Data":"397c197bb4916cc3fdaeece9fa61fc8cbfc77895ee2b87f8e8864bb2222c4711"} Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.308537 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerStarted","Data":"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271"} Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.308629 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.590497 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.608996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.609037 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.609079 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nktr\" (UniqueName: \"kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.609205 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.609239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.609279 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs\") pod \"2e51130d-df1d-4b88-96b4-d7fdaf627585\" (UID: \"2e51130d-df1d-4b88-96b4-d7fdaf627585\") " Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.610118 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs" (OuterVolumeSpecName: "logs") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.620708 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr" (OuterVolumeSpecName: "kube-api-access-4nktr") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "kube-api-access-4nktr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.625363 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.405635779 podStartE2EDuration="5.625336762s" podCreationTimestamp="2025-12-06 07:21:48 +0000 UTC" firstStartedPulling="2025-12-06 07:21:49.144084057 +0000 UTC m=+1483.957443486" lastFinishedPulling="2025-12-06 07:21:52.36378508 +0000 UTC m=+1487.177144469" observedRunningTime="2025-12-06 07:21:53.345150477 +0000 UTC m=+1488.158509876" watchObservedRunningTime="2025-12-06 07:21:53.625336762 +0000 UTC m=+1488.438696151" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.656673 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data" (OuterVolumeSpecName: "config-data") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.661691 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.694351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.711548 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e51130d-df1d-4b88-96b4-d7fdaf627585" (UID: "2e51130d-df1d-4b88-96b4-d7fdaf627585"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712614 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712647 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712661 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nktr\" (UniqueName: \"kubernetes.io/projected/2e51130d-df1d-4b88-96b4-d7fdaf627585-kube-api-access-4nktr\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712680 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712693 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e51130d-df1d-4b88-96b4-d7fdaf627585-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:53 crc kubenswrapper[4954]: I1206 07:21:53.712705 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e51130d-df1d-4b88-96b4-d7fdaf627585-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.318552 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" containerID="cri-o://0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461" gracePeriod=30 Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.318971 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.318983 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e51130d-df1d-4b88-96b4-d7fdaf627585","Type":"ContainerDied","Data":"3a79582f615b4bb1056cf00b9bada647f848bba9d3af24671b4eb4cd19297980"} Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.319025 4954 scope.go:117] "RemoveContainer" containerID="0d75a0fbb0dc278846fb50c1bc0b4b577c675599c9f610b4ac5a6dc908b9d049" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.320393 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" containerID="cri-o://2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7" gracePeriod=30 Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.355884 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.364022 4954 scope.go:117] "RemoveContainer" containerID="397c197bb4916cc3fdaeece9fa61fc8cbfc77895ee2b87f8e8864bb2222c4711" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.366583 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.392883 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:54 crc kubenswrapper[4954]: E1206 07:21:54.427585 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-api" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.427645 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-api" Dec 06 07:21:54 crc kubenswrapper[4954]: E1206 07:21:54.427685 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-log" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.427693 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-log" Dec 06 07:21:54 crc kubenswrapper[4954]: E1206 07:21:54.427721 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8cf20f-f0a2-417a-914a-7710fa0fafb9" containerName="nova-manage" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.427728 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8cf20f-f0a2-417a-914a-7710fa0fafb9" containerName="nova-manage" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.432247 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-api" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.432356 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8cf20f-f0a2-417a-914a-7710fa0fafb9" containerName="nova-manage" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.432377 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" containerName="nova-api-log" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.434528 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.443900 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.444723 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.444793 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.458312 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.547666 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhspz\" (UniqueName: \"kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.547953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.548001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.548030 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.548049 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.548074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.650412 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhspz\" (UniqueName: \"kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.650830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data\") pod 
\"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.651015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.651158 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.651268 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.651361 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.651753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.657356 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.658007 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.658355 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.658481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.671527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhspz\" (UniqueName: \"kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz\") pod \"nova-api-0\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") " 
pod="openstack/nova-api-0" Dec 06 07:21:54 crc kubenswrapper[4954]: I1206 07:21:54.801658 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:21:55 crc kubenswrapper[4954]: I1206 07:21:55.262451 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:21:55 crc kubenswrapper[4954]: W1206 07:21:55.268184 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc76a6f0b_0bc0_4f2e_8e03_f7871b960ebe.slice/crio-b46e78bf1848a06a70c7514940675caa666a41fb787968055e444933e52c136b WatchSource:0}: Error finding container b46e78bf1848a06a70c7514940675caa666a41fb787968055e444933e52c136b: Status 404 returned error can't find the container with id b46e78bf1848a06a70c7514940675caa666a41fb787968055e444933e52c136b Dec 06 07:21:55 crc kubenswrapper[4954]: I1206 07:21:55.331211 4954 generic.go:334] "Generic (PLEG): container finished" podID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerID="0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461" exitCode=143 Dec 06 07:21:55 crc kubenswrapper[4954]: I1206 07:21:55.331314 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerDied","Data":"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461"} Dec 06 07:21:55 crc kubenswrapper[4954]: I1206 07:21:55.334216 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerStarted","Data":"b46e78bf1848a06a70c7514940675caa666a41fb787968055e444933e52c136b"} Dec 06 07:21:55 crc kubenswrapper[4954]: I1206 07:21:55.455410 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e51130d-df1d-4b88-96b4-d7fdaf627585" path="/var/lib/kubelet/pods/2e51130d-df1d-4b88-96b4-d7fdaf627585/volumes" Dec 06 07:21:56 crc kubenswrapper[4954]: I1206 07:21:56.347886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerStarted","Data":"5c5b57148f52d4e8a3969fbcc3484fc2918682cc67900402ffb917e03848f564"} Dec 06 07:21:56 crc kubenswrapper[4954]: I1206 07:21:56.348256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerStarted","Data":"6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96"} Dec 06 07:21:56 crc kubenswrapper[4954]: I1206 07:21:56.392135 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3920946880000002 podStartE2EDuration="2.392094688s" podCreationTimestamp="2025-12-06 07:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:21:56.375254915 +0000 UTC m=+1491.188614304" watchObservedRunningTime="2025-12-06 07:21:56.392094688 +0000 UTC m=+1491.205454077" Dec 06 07:21:57 crc kubenswrapper[4954]: I1206 07:21:57.462364 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:53156->10.217.0.197:8775: read: connection reset by peer" Dec 06 07:21:57 crc kubenswrapper[4954]: 
I1206 07:21:57.462682 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:53158->10.217.0.197:8775: read: connection reset by peer" Dec 06 07:21:57 crc kubenswrapper[4954]: E1206 07:21:57.895140 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b is running failed: container process not found" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 07:21:57 crc kubenswrapper[4954]: E1206 07:21:57.896353 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b is running failed: container process not found" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 07:21:57 crc kubenswrapper[4954]: E1206 07:21:57.898045 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b is running failed: container process not found" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 07:21:57 crc kubenswrapper[4954]: E1206 07:21:57.898083 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="54d05135-2771-460c-9949-4e16516015c4" containerName="nova-scheduler-scheduler" Dec 06 07:21:57 crc kubenswrapper[4954]: I1206 07:21:57.957502 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.130335 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.139526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52j85\" (UniqueName: \"kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85\") pod \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.139615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data\") pod \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.139640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle\") pod \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.139789 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs\") pod \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.139904 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs\") pod \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\" (UID: \"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.141634 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs" (OuterVolumeSpecName: "logs") pod "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" (UID: "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.153609 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85" (OuterVolumeSpecName: "kube-api-access-52j85") pod "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" (UID: "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd"). InnerVolumeSpecName "kube-api-access-52j85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.176771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" (UID: "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.195012 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data" (OuterVolumeSpecName: "config-data") pod "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" (UID: "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.218423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" (UID: "eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.241723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle\") pod \"54d05135-2771-460c-9949-4e16516015c4\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.241901 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data\") pod \"54d05135-2771-460c-9949-4e16516015c4\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.241945 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlfc2\" (UniqueName: \"kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2\") pod \"54d05135-2771-460c-9949-4e16516015c4\" (UID: \"54d05135-2771-460c-9949-4e16516015c4\") " Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.242392 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52j85\" (UniqueName: \"kubernetes.io/projected/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-kube-api-access-52j85\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.242410 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.242421 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.242430 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.242438 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.245723 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2" (OuterVolumeSpecName: "kube-api-access-dlfc2") pod "54d05135-2771-460c-9949-4e16516015c4" (UID: "54d05135-2771-460c-9949-4e16516015c4"). InnerVolumeSpecName "kube-api-access-dlfc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.272556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54d05135-2771-460c-9949-4e16516015c4" (UID: "54d05135-2771-460c-9949-4e16516015c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.280886 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data" (OuterVolumeSpecName: "config-data") pod "54d05135-2771-460c-9949-4e16516015c4" (UID: "54d05135-2771-460c-9949-4e16516015c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.344076 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlfc2\" (UniqueName: \"kubernetes.io/projected/54d05135-2771-460c-9949-4e16516015c4-kube-api-access-dlfc2\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.344125 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.344136 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d05135-2771-460c-9949-4e16516015c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.377000 4954 generic.go:334] "Generic (PLEG): container finished" podID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerID="2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7" exitCode=0 Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.377105 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.377108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerDied","Data":"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7"} Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.377252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd","Type":"ContainerDied","Data":"c6691f557efc50f71003c5353a0d0b0eb51a139e841e13ae3ec303be659df647"} Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.377282 4954 scope.go:117] "RemoveContainer" containerID="2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.381575 4954 generic.go:334] "Generic (PLEG): container finished" podID="54d05135-2771-460c-9949-4e16516015c4" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" exitCode=0 Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.381616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"54d05135-2771-460c-9949-4e16516015c4","Type":"ContainerDied","Data":"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b"} Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.381640 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"54d05135-2771-460c-9949-4e16516015c4","Type":"ContainerDied","Data":"7d7866f19950fb113dbb94472eda9135a59f8e762fca28aef0fa16631311a4ba"} Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.381680 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.412251 4954 scope.go:117] "RemoveContainer" containerID="0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.426939 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.464790 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.475725 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.476924 4954 scope.go:117] "RemoveContainer" containerID="2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7" Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.482073 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7\": container with ID starting with 2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7 not found: ID does not exist" containerID="2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.482145 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7"} err="failed to get container status \"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7\": rpc error: code = NotFound desc = could not find container \"2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7\": container with ID starting with 2b3c10bb6fbb7cba56a366b47af2d85fc7beef682db7458a31124a55307d76a7 not found: ID does not exist" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.482181 4954 scope.go:117] "RemoveContainer" containerID="0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461" Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.483477 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461\": container with ID starting with 0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461 not found: ID does not exist" containerID="0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.483529 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461"} err="failed to get container status \"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461\": rpc error: code = NotFound desc = could not find container \"0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461\": container with ID starting with 0df57cfe1015d2399cf8df8174c51253197ec3e7f94347194281810498a39461 not found: ID does not exist" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.483575 4954 scope.go:117] "RemoveContainer" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.487232 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.487957 
4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.487980 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.487994 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.488003 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.488039 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d05135-2771-460c-9949-4e16516015c4" containerName="nova-scheduler-scheduler" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.488048 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d05135-2771-460c-9949-4e16516015c4" containerName="nova-scheduler-scheduler" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.488307 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d05135-2771-460c-9949-4e16516015c4" containerName="nova-scheduler-scheduler" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.488332 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-log" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.488349 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" containerName="nova-metadata-metadata" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.489948 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.500656 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.501485 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.501774 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.508728 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.519506 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.520030 4954 scope.go:117] "RemoveContainer" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" Dec 06 07:21:58 crc kubenswrapper[4954]: E1206 07:21:58.521007 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b\": container with ID starting with 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b not found: ID does not exist" containerID="0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.521078 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b"} err="failed to get container status \"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b\": rpc error: code = NotFound desc = could not find container \"0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b\": container with ID starting with 0d8a9291d43220232f9272b31f563b2fd3a3e841f5c6c54c27aa27291dddb99b not found: ID does not exist" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.522226 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.525736 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.529661 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651352 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975mx\" (UniqueName: \"kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651412 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcp8\" (UniqueName: \"kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651447 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651599 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651656 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.651691 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753201 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753270 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753290 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753332 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753389 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753411 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-975mx\" (UniqueName: \"kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.753847 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.754242 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcp8\" (UniqueName: \"kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.754379 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.758412 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 
crc kubenswrapper[4954]: I1206 07:21:58.758595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.759524 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.759948 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.760618 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.769658 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-975mx\" (UniqueName: \"kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx\") pod \"nova-scheduler-0\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " pod="openstack/nova-scheduler-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.772518 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcp8\" (UniqueName: \"kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8\") pod \"nova-metadata-0\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.821524 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:21:58 crc kubenswrapper[4954]: I1206 07:21:58.899050 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:21:59 crc kubenswrapper[4954]: W1206 07:21:59.300438 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod432b9d93_b045_4e25_b58b_b3a6fd8512c4.slice/crio-a2f1467030e2ed182f134941e6c80e1c7330809d7284e0dab90842fce7458b87 WatchSource:0}: Error finding container a2f1467030e2ed182f134941e6c80e1c7330809d7284e0dab90842fce7458b87: Status 404 returned error can't find the container with id a2f1467030e2ed182f134941e6c80e1c7330809d7284e0dab90842fce7458b87 Dec 06 07:21:59 crc kubenswrapper[4954]: I1206 07:21:59.301530 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:21:59 crc kubenswrapper[4954]: I1206 07:21:59.400798 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerStarted","Data":"a2f1467030e2ed182f134941e6c80e1c7330809d7284e0dab90842fce7458b87"} Dec 06 07:21:59 crc kubenswrapper[4954]: I1206 07:21:59.415496 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:21:59 crc kubenswrapper[4954]: I1206 07:21:59.464072 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d05135-2771-460c-9949-4e16516015c4" path="/var/lib/kubelet/pods/54d05135-2771-460c-9949-4e16516015c4/volumes" Dec 06 07:21:59 crc kubenswrapper[4954]: I1206 07:21:59.466357 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd" path="/var/lib/kubelet/pods/eee9fe3e-4652-4a6e-aaaf-a28e6738e9dd/volumes" Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.415769 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerStarted","Data":"d14ca933ddb6a12f4399c39d7a1279cb6dd7f6ad978226ac72fa861be2573071"} Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.416522 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerStarted","Data":"ac777ba7c81ee56eb4fb435d7da2f478c6a3c4f40024a4daae7126627007674a"} Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.429255 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1","Type":"ContainerStarted","Data":"3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f"} Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.429305 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1","Type":"ContainerStarted","Data":"00ec9c4811b34c7923cdf604064125a472b2f12d6fe03e96a6f5155d5d491496"} Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.476274 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.476245791 podStartE2EDuration="2.476245791s" podCreationTimestamp="2025-12-06 07:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:22:00.439801429 +0000 UTC m=+1495.253160838" watchObservedRunningTime="2025-12-06 07:22:00.476245791 +0000 UTC m=+1495.289605200" Dec 06 07:22:00 crc kubenswrapper[4954]: I1206 07:22:00.499134 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.499113026 podStartE2EDuration="2.499113026s" podCreationTimestamp="2025-12-06 07:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:22:00.47473057 +0000 UTC m=+1495.288089959" watchObservedRunningTime="2025-12-06 07:22:00.499113026 +0000 UTC m=+1495.312472415" Dec 06 07:22:03 crc kubenswrapper[4954]: I1206 07:22:03.822026 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:22:03 crc kubenswrapper[4954]: I1206 07:22:03.822295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 07:22:03 crc kubenswrapper[4954]: I1206 07:22:03.899230 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 07:22:04 crc kubenswrapper[4954]: I1206 07:22:04.803009 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:22:04 crc kubenswrapper[4954]: I1206 07:22:04.803553 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 07:22:05 crc kubenswrapper[4954]: I1206 07:22:05.820845 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:05 crc kubenswrapper[4954]: I1206 07:22:05.820855 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:08 crc kubenswrapper[4954]: I1206 07:22:08.823010 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 07:22:08 crc kubenswrapper[4954]: I1206 07:22:08.823571 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 07:22:08 crc kubenswrapper[4954]: I1206 07:22:08.899718 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 07:22:08 crc kubenswrapper[4954]: I1206 07:22:08.956912 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 07:22:09 crc kubenswrapper[4954]: I1206 07:22:09.552360 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 07:22:09 crc kubenswrapper[4954]: I1206 07:22:09.837744 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:09 crc kubenswrapper[4954]: I1206 07:22:09.837891 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.101827 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.102284 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.102411 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.103380 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.103517 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" gracePeriod=600 Dec 06 07:22:10 crc kubenswrapper[4954]: E1206 07:22:10.242694 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.533455 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" exitCode=0 Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.533544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689"} Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.533649 4954 scope.go:117] "RemoveContainer" containerID="3092fb494fb5f62377d5237c9a62fe265b08e78c0cfb30eb9b606f026fbf3679" Dec 06 07:22:10 crc kubenswrapper[4954]: I1206 07:22:10.535736 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:22:10 crc kubenswrapper[4954]: E1206 07:22:10.536617 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.816708 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.817210 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.817977 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.817996 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.829741 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 07:22:14 crc kubenswrapper[4954]: I1206 07:22:14.830075 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 07:22:18 crc kubenswrapper[4954]: I1206 07:22:18.667006 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 07:22:18 crc kubenswrapper[4954]: I1206 07:22:18.831418 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 07:22:18 crc kubenswrapper[4954]: I1206 07:22:18.832232 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 07:22:18 crc kubenswrapper[4954]: I1206 07:22:18.841405 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 07:22:19 crc kubenswrapper[4954]: I1206 07:22:19.641474 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 07:22:24 crc kubenswrapper[4954]: I1206 07:22:24.443619 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:22:24 crc kubenswrapper[4954]: E1206 07:22:24.444483 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:22:37 crc kubenswrapper[4954]: I1206 07:22:37.443199 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:22:37 crc kubenswrapper[4954]: E1206 07:22:37.444011 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.026583 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glanced905-account-delete-6dwqh"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.028683 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.038080 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.038200 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt8w\" (UniqueName: \"kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.078618 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.078866 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" containerName="openstackclient" containerID="cri-o://084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64" gracePeriod=2 Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.091609 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glanced905-account-delete-6dwqh"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.107506 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.122600 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.139442 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.139537 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt8w\" (UniqueName: \"kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.140688 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.166434 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.166773 4954 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ovn-northd-0" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" containerID="cri-o://9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" gracePeriod=30 Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.167232 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="openstack-network-exporter" containerID="cri-o://b185262581277ddddfd5f329caf8d803c0479c8dea4732929b0bc2936e8a4cb6" gracePeriod=30 Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.182268 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zttzq"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.195409 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zttzq"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.214616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmt8w\" (UniqueName: \"kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w\") pod \"glanced905-account-delete-6dwqh\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.226800 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fcgjm"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.252637 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fcgjm"] Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.269467 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.269537 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data podName:578bec25-a54c-4f52-95f2-19f20f833437 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:42.769515951 +0000 UTC m=+1537.582875340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data") pod "rabbitmq-server-0" (UID: "578bec25-a54c-4f52-95f2-19f20f833437") : configmap "rabbitmq-config-data" not found Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.282609 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderdf0e-account-delete-fsz84"] Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.283414 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" containerName="openstackclient" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.283428 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" containerName="openstackclient" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.283695 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" containerName="openstackclient" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.284456 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.319648 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderdf0e-account-delete-fsz84"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.332682 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement88ab-account-delete-597b8"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.334385 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.357636 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lnbn8"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.365132 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.372059 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.372117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbc8h\" (UniqueName: \"kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.372144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mclv\" (UniqueName: \"kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv\") pod \"cinderdf0e-account-delete-fsz84\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.372237 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts\") pod \"cinderdf0e-account-delete-fsz84\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.430174 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement88ab-account-delete-597b8"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.479059 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc8h\" (UniqueName: \"kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.479115 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mclv\" (UniqueName: \"kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv\") pod \"cinderdf0e-account-delete-fsz84\" (UID: 
\"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.479240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts\") pod \"cinderdf0e-account-delete-fsz84\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.479343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.487458 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts\") pod \"cinderdf0e-account-delete-fsz84\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.492322 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.521380 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.521689 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-hcgg8" podUID="75a7f9b6-6582-4498-bccd-954270bc3f8e" containerName="openstack-network-exporter" containerID="cri-o://051545d1d756d25636df272c7f761b7921bf9ef327f55451e1d78ab235564be7" gracePeriod=30 Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.548822 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbc8h\" (UniqueName: \"kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h\") pod \"placement88ab-account-delete-597b8\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.567403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mclv\" (UniqueName: \"kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv\") pod \"cinderdf0e-account-delete-fsz84\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.606861 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xskgs"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.631243 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9zvlz"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.685651 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9zvlz"] Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.744738 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.778743 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.779785 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-server" containerID="cri-o://68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.780343 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="swift-recon-cron" containerID="cri-o://aa40c5ec6d752ec2fbe2321f2103def28820a11e0c19b5134c7eb5bd9a35558b" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.780898 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-updater" containerID="cri-o://3a5cc68bf70f6cc371a8cb6cffab174f354261b971b711b47d5d116af25c682e" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.780988 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="rsync" containerID="cri-o://60ca35ac3122f717dfae9adfc952b010f5c490d1fbab75b0ecd0e2061e89d87e" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781001 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-auditor" containerID="cri-o://ec4a2a2f803658eaa7ba7c7d378bed85b1093f0a8c4791ca3b80cb2214e6e449" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781045 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-expirer" containerID="cri-o://85237b97891e4838aee5911e5b516a52841339958665c7a83d1d2d7121799c5a" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781064 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-replicator" containerID="cri-o://bb1952930feb993e59b60f78d883b948aa239b6e485225832fb60b3345763360" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781085 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-updater" containerID="cri-o://3e090d4172cdef300aa94f8d8fba36deddc5d8017b13b15dfe18162d7c6aa8ef" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781106 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-server" containerID="cri-o://854d5d618d13ec03ec6c2f4744287e27f89042a1dae6f369b12bbc3408f4ff5c" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781125 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-auditor" containerID="cri-o://9e6c70c416cc446ced1c2039d0651b51abc7bb3be2892db83aa00306a5d00bb3" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781143 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-reaper" containerID="cri-o://17fe025c33d04ce12223561653e1cb4e9236655eaf9cd02886b52df15dccb969" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781163 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-replicator" containerID="cri-o://fd7f3929fff74fd65f6989375596655e7c51806663d983bd2aad57effe232114" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781179 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-auditor" containerID="cri-o://7d0ab20184ee34faada856909d1b2e18f0f407fa1b19e33fe65f5f582f55f86c" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781202 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-server" containerID="cri-o://e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.781217 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-replicator" containerID="cri-o://1bb5213e7cc1c7b665991c2e686f0fcd95258f340b82e006fe07376d1ac56ed8" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.786702 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderdf0e-account-delete-fsz84"
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.804909 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.804994 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data podName:31452db7-e2c4-4e61-8f8c-7017476f0bc0 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:43.3049733 +0000 UTC m=+1538.118332689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0") : configmap "rabbitmq-cell1-config-data" not found
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.808442 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.808514 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data podName:578bec25-a54c-4f52-95f2-19f20f833437 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:43.808492655 +0000 UTC m=+1538.621852044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data") pod "rabbitmq-server-0" (UID: "578bec25-a54c-4f52-95f2-19f20f833437") : configmap "rabbitmq-config-data" not found
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.856659 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.857026 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-log" containerID="cri-o://c848e10be510af8ec76555e3960b2f1fc5ccf1b3023d60abc7f4ffb8dd93dba2" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.857666 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-httpd" containerID="cri-o://e09a4c785be2122538bcc0a01f7e6782d054fe47c518d01aab3ac5049b21cf94" gracePeriod=30
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.890259 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.891391 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="openstack-network-exporter" containerID="cri-o://e5d78eb2813e941bbf1327182aea6826153731bf9030370520ea6cf536e2c69c" gracePeriod=300
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.985768 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-cddwk"]
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.986029 4954 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-lnbn8" message="Exiting ovn-controller (1) "
Dec 06 07:22:42 crc kubenswrapper[4954]: E1206 07:22:42.986074 4954 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-lnbn8" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller" containerID="cri-o://fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db"
Dec 06 07:22:42 crc kubenswrapper[4954]: I1206 07:22:42.986108 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-lnbn8" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller" containerID="cri-o://fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db" gracePeriod=30
Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.050001 4954 generic.go:334] "Generic (PLEG): container finished" podID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerID="b185262581277ddddfd5f329caf8d803c0479c8dea4732929b0bc2936e8a4cb6" exitCode=2
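The PreStop hook above exited with 137, which by the usual shell convention is 128 plus the signal number: the ovn-ctl hook was still running when signal 9 (SIGKILL) arrived. A quick Go sketch of that decoding; this is the generic POSIX rule, not a kubelet API.

package main

import (
	"fmt"
	"syscall"
)

// explainExit decodes a shell-style exit status: values above 128 mean
// the process was terminated by signal (status - 128). 137 in the
// PreStop failure above is therefore SIGKILL; the exitCode=143 entries
// later in this log are SIGTERM.
func explainExit(status int) string {
	if status > 128 {
		sig := syscall.Signal(status - 128)
		return fmt.Sprintf("terminated by signal %d (%s)", status-128, sig)
	}
	return fmt.Sprintf("exited with code %d", status)
}

func main() {
	fmt.Println(explainExit(137)) // terminated by signal 9 (killed)
	fmt.Println(explainExit(143)) // terminated by signal 15 (terminated)
	fmt.Println(explainExit(2))   // exited with code 2
}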
event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerDied","Data":"b185262581277ddddfd5f329caf8d803c0479c8dea4732929b0bc2936e8a4cb6"} Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.056492 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-cddwk"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.081141 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hcgg8_75a7f9b6-6582-4498-bccd-954270bc3f8e/openstack-network-exporter/0.log" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.081196 4954 generic.go:334] "Generic (PLEG): container finished" podID="75a7f9b6-6582-4498-bccd-954270bc3f8e" containerID="051545d1d756d25636df272c7f761b7921bf9ef327f55451e1d78ab235564be7" exitCode=2 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.081236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgg8" event={"ID":"75a7f9b6-6582-4498-bccd-954270bc3f8e","Type":"ContainerDied","Data":"051545d1d756d25636df272c7f761b7921bf9ef327f55451e1d78ab235564be7"} Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.193768 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.194499 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="openstack-network-exporter" containerID="cri-o://f5f430ad3df9e0cf4cc84ee32b89af25b995b3297d702d7f97feec4a11fa4787" gracePeriod=300 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.196196 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.217263 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xc5wf"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.240578 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xc5wf"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.254776 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.255094 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="cinder-scheduler" containerID="cri-o://25aaea2454949bcf5aa44f34e823a052575eba578039116af36245405f9d00cf" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.255688 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="probe" containerID="cri-o://dde3cc2c35c6b947b115a84354f88ab0c259ca23e0659e70ff1f7de4d80de96e" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.290372 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican664f-account-delete-l67mj"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.292323 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.312668 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican664f-account-delete-l67mj"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.345109 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron251d-account-delete-rhsq5"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.346524 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron251d-account-delete-rhsq5"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.348138 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.354751 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4gg\" (UniqueName: \"kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.354793 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.354887 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v275\" (UniqueName: \"kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.354967 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: E1206 07:22:43.355096 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:43 crc kubenswrapper[4954]: E1206 07:22:43.355157 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data podName:31452db7-e2c4-4e61-8f8c-7017476f0bc0 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:44.355137236 +0000 UTC m=+1539.168496625 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0") : configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.388216 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.388545 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55bf8ff4-7brlv" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-log" containerID="cri-o://27d6334f40e13cafc61de087025fab078b3fe41774336c33565c8ae4035e063f" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.388712 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55bf8ff4-7brlv" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-api" containerID="cri-o://6ab0f45e984c650aee9fc279116d7bc9e68e5fe5d6b74cb802fbb810f6d5de26" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.402106 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi9e6e-account-delete-2shf2"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.443202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.465832 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v275\" (UniqueName: \"kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.465934 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.466004 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4gg\" (UniqueName: \"kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.466025 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.467070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc 
kubenswrapper[4954]: I1206 07:22:43.468191 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.581934 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrq9\" (UniqueName: \"kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.582070 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.597726 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b212057-565c-4246-820a-a804fb6da962" path="/var/lib/kubelet/pods/1b212057-565c-4246-820a-a804fb6da962/volumes" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.598788 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a53752-0ee5-43db-b968-b4d11414ffdb" path="/var/lib/kubelet/pods/24a53752-0ee5-43db-b968-b4d11414ffdb/volumes" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.599548 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7854eba-8c30-4c39-9c96-36e1e3cf7437" path="/var/lib/kubelet/pods/b7854eba-8c30-4c39-9c96-36e1e3cf7437/volumes" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.604617 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f9fd4e-cc86-43cf-8195-cae0becb9e64" path="/var/lib/kubelet/pods/c5f9fd4e-cc86-43cf-8195-cae0becb9e64/volumes" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.606044 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70d4759-1065-4412-9685-898f12a23a38" path="/var/lib/kubelet/pods/c70d4759-1065-4412-9685-898f12a23a38/volumes" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.606922 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi9e6e-account-delete-2shf2"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.606969 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-khvqc"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.638792 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-khvqc"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.685132 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrq9\" (UniqueName: \"kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.685259 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.686190 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.694806 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.695325 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-log" containerID="cri-o://0bb0b24e4d3085b50b85c5df8d24f8ed4c7c0c28e8532d6d7a2d8957c39ce556" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.695982 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-httpd" containerID="cri-o://dff634325c63563a5768825e16323ab3d990f36bc7c9fcb6773bf50f64fbe138" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.747831 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.748125 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="dnsmasq-dns" containerID="cri-o://9d7871a1cc043780cd4a4b6b1e5824bece7fc1fc553c7805c5dd188196365a91" gracePeriod=10 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.805068 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0f048-account-delete-6rspl"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.806528 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.813913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4gg\" (UniqueName: \"kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg\") pod \"barbican664f-account-delete-l67mj\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.816181 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v275\" (UniqueName: \"kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275\") pod \"neutron251d-account-delete-rhsq5\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.816259 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.816522 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api-log" containerID="cri-o://2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.816680 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api" containerID="cri-o://6880fda36024a9648a750eeda330087a75c544818aceab1279c1dd4c9af76cd2" gracePeriod=30 Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.833455 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0f048-account-delete-6rspl"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.853117 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrq9\" (UniqueName: \"kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9\") pod \"novaapi9e6e-account-delete-2shf2\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.866974 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjzpd"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.873679 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjzpd"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.900679 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dw292"] Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.910636 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts\") pod \"novacell0f048-account-delete-6rspl\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.911010 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9rr\" (UniqueName: \"kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr\") pod \"novacell0f048-account-delete-6rspl\" (UID: 
\"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:43 crc kubenswrapper[4954]: E1206 07:22:43.911395 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 06 07:22:43 crc kubenswrapper[4954]: E1206 07:22:43.911455 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data podName:578bec25-a54c-4f52-95f2-19f20f833437 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:45.911438117 +0000 UTC m=+1540.724797506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data") pod "rabbitmq-server-0" (UID: "578bec25-a54c-4f52-95f2-19f20f833437") : configmap "rabbitmq-config-data" not found Dec 06 07:22:43 crc kubenswrapper[4954]: I1206 07:22:43.935647 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dw292"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.015916 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9rr\" (UniqueName: \"kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr\") pod \"novacell0f048-account-delete-6rspl\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.016228 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts\") pod \"novacell0f048-account-delete-6rspl\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.017202 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts\") pod \"novacell0f048-account-delete-6rspl\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.031703 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.032009 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78654684fc-84hfw" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-api" containerID="cri-o://59705182107d1f5a2685c487316a4e8779bdf03f701931d0f8c6734cc83fb7c6" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.032532 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78654684fc-84hfw" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-httpd" containerID="cri-o://feff0af557a19a07e436a4ff2db6401f9895dc0adfba494469ecdb5c414013e9" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.041128 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.078908 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.079407 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9rr\" (UniqueName: \"kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr\") pod \"novacell0f048-account-delete-6rspl\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.079529 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.107659 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.107745 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.112135 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.131125 4954 generic.go:334] "Generic (PLEG): container finished" podID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerID="9d7871a1cc043780cd4a4b6b1e5824bece7fc1fc553c7805c5dd188196365a91" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.131268 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.131300 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" event={"ID":"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f","Type":"ContainerDied","Data":"9d7871a1cc043780cd4a4b6b1e5824bece7fc1fc553c7805c5dd188196365a91"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.157519 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hcgg8_75a7f9b6-6582-4498-bccd-954270bc3f8e/openstack-network-exporter/0.log" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.157730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgg8" event={"ID":"75a7f9b6-6582-4498-bccd-954270bc3f8e","Type":"ContainerDied","Data":"671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.157773 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671cdc16f347824ef49d91085cf54d6c45dde241f4e4211fa782b345eaf525f3" Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.170770 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.171362 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener-log" containerID="cri-o://0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.172307 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener" containerID="cri-o://37978e335651ac4f0f09719f13e2d04c7acd8c2ac455f49f29118e1f83cda3a9" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.172705 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="ovsdbserver-sb" containerID="cri-o://03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93" gracePeriod=299 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.202127 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.241730 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.242098 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-log" containerID="cri-o://6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96" gracePeriod=30 Dec 06 07:22:44 crc 
kubenswrapper[4954]: I1206 07:22:44.242736 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-api" containerID="cri-o://5c5b57148f52d4e8a3969fbcc3484fc2918682cc67900402ffb917e03848f564" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263262 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="60ca35ac3122f717dfae9adfc952b010f5c490d1fbab75b0ecd0e2061e89d87e" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263289 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="85237b97891e4838aee5911e5b516a52841339958665c7a83d1d2d7121799c5a" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263297 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="3e090d4172cdef300aa94f8d8fba36deddc5d8017b13b15dfe18162d7c6aa8ef" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263304 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="9e6c70c416cc446ced1c2039d0651b51abc7bb3be2892db83aa00306a5d00bb3" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263312 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="fd7f3929fff74fd65f6989375596655e7c51806663d983bd2aad57effe232114" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263319 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="3a5cc68bf70f6cc371a8cb6cffab174f354261b971b711b47d5d116af25c682e" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263324 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="ec4a2a2f803658eaa7ba7c7d378bed85b1093f0a8c4791ca3b80cb2214e6e449" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263330 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="bb1952930feb993e59b60f78d883b948aa239b6e485225832fb60b3345763360" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263336 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="17fe025c33d04ce12223561653e1cb4e9236655eaf9cd02886b52df15dccb969" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263342 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="7d0ab20184ee34faada856909d1b2e18f0f407fa1b19e33fe65f5f582f55f86c" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263348 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="1bb5213e7cc1c7b665991c2e686f0fcd95258f340b82e006fe07376d1ac56ed8" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"60ca35ac3122f717dfae9adfc952b010f5c490d1fbab75b0ecd0e2061e89d87e"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"85237b97891e4838aee5911e5b516a52841339958665c7a83d1d2d7121799c5a"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263610 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"3e090d4172cdef300aa94f8d8fba36deddc5d8017b13b15dfe18162d7c6aa8ef"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263620 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"9e6c70c416cc446ced1c2039d0651b51abc7bb3be2892db83aa00306a5d00bb3"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263631 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"fd7f3929fff74fd65f6989375596655e7c51806663d983bd2aad57effe232114"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263643 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"3a5cc68bf70f6cc371a8cb6cffab174f354261b971b711b47d5d116af25c682e"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263653 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"ec4a2a2f803658eaa7ba7c7d378bed85b1093f0a8c4791ca3b80cb2214e6e449"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263661 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"bb1952930feb993e59b60f78d883b948aa239b6e485225832fb60b3345763360"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263670 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"17fe025c33d04ce12223561653e1cb4e9236655eaf9cd02886b52df15dccb969"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263714 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"7d0ab20184ee34faada856909d1b2e18f0f407fa1b19e33fe65f5f582f55f86c"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.263726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"1bb5213e7cc1c7b665991c2e686f0fcd95258f340b82e006fe07376d1ac56ed8"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.285979 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.286027 4954 generic.go:334] "Generic (PLEG): container finished" podID="40b282e6-b847-4393-89f2-7844fce43388" containerID="e5d78eb2813e941bbf1327182aea6826153731bf9030370520ea6cf536e2c69c" exitCode=2 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.286039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerDied","Data":"e5d78eb2813e941bbf1327182aea6826153731bf9030370520ea6cf536e2c69c"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.324826 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.325086 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" containerID="cri-o://ac777ba7c81ee56eb4fb435d7da2f478c6a3c4f40024a4daae7126627007674a" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.325833 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" containerID="cri-o://d14ca933ddb6a12f4399c39d7a1279cb6dd7f6ad978226ac72fa861be2573071" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.368081 4954 generic.go:334] "Generic (PLEG): container finished" podID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerID="f5f430ad3df9e0cf4cc84ee32b89af25b995b3297d702d7f97feec4a11fa4787" exitCode=2 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.368204 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerDied","Data":"f5f430ad3df9e0cf4cc84ee32b89af25b995b3297d702d7f97feec4a11fa4787"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.406390 4954 generic.go:334] "Generic (PLEG): container finished" podID="092ab1a2-b565-47cd-9b83-f306883b688e" containerID="fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db" exitCode=0 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.406831 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8" event={"ID":"092ab1a2-b565-47cd-9b83-f306883b688e","Type":"ContainerDied","Data":"fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.436257 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.436717 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5bf7788f9-vw5rh" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker-log" containerID="cri-o://632cafbc6ab8b4bca12eb7135c2589ae827ea7666ad9a40b63f7adf3a97af9c7" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.437470 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5bf7788f9-vw5rh" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker" containerID="cri-o://ce32b7671edf55a676312d4bc38e6dda973ecc26c56be59a3598a590d1eebc14" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.464926 4954 generic.go:334] "Generic (PLEG): container finished" podID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerID="c848e10be510af8ec76555e3960b2f1fc5ccf1b3023d60abc7f4ffb8dd93dba2" exitCode=143 Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.466989 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.467063 4954 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data podName:31452db7-e2c4-4e61-8f8c-7017476f0bc0 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:46.467040609 +0000 UTC m=+1541.280399998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0") : configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.464973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerDied","Data":"c848e10be510af8ec76555e3960b2f1fc5ccf1b3023d60abc7f4ffb8dd93dba2"} Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.495207 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.495464 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759b9cfd76-jp2pl" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api-log" containerID="cri-o://d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.495890 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759b9cfd76-jp2pl" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api" containerID="cri-o://01b096b10176462db3110ae80702f8ad1ade5c76a6f11bd639c695de9a47e99c" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.505458 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4dbc4"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.508073 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="ovsdbserver-nb" containerID="cri-o://98c8778c7e8353cc4dc52283f262371684ed883dec692ca28a39a816cbb59cd0" gracePeriod=299 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.522956 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lzjzc"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.541709 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4dbc4"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.552367 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lzjzc"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.587289 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7ee2-account-create-update-z9pcl"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.598625 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7ee2-account-create-update-z9pcl"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.607183 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.607434 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" 
containerName="nova-cell0-conductor-conductor" containerID="cri-o://358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.608081 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="rabbitmq" containerID="cri-o://cf94bc0a088f861f78bc849ba8a3d2602f00e20f5f2438d40190effa61b16e55" gracePeriod=604800 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.616980 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" containerID="cri-o://f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" gracePeriod=28 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.616992 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="rabbitmq" containerID="cri-o://146293bb351f1e02c74e1849e3be8838c1377922574653baaa3053a66ff3aad5" gracePeriod=604800 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.618092 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlbn"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.626173 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.626747 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="dc34e079-7411-438c-aca8-b2c95854158e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://31234f4535422fb1bf7bada8c66937069f24dc6cdacdb390f9005698b7491575" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.633509 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2jlbn"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.643650 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.643924 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9" gracePeriod=30 Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.686406 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: connect: connection refused" Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.847484 4954 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 06 07:22:44 crc kubenswrapper[4954]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 06 07:22:44 crc kubenswrapper[4954]: + source /usr/local/bin/container-scripts/functions Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNBridge=br-int Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNRemote=tcp:localhost:6642 Dec 06 07:22:44 crc kubenswrapper[4954]: ++ 
Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.847484 4954 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 06 07:22:44 crc kubenswrapper[4954]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 06 07:22:44 crc kubenswrapper[4954]: + source /usr/local/bin/container-scripts/functions
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNBridge=br-int
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNRemote=tcp:localhost:6642
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNEncapType=geneve
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNAvailabilityZones=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ EnableChassisAsGateway=true
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ PhysicalNetworks=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNHostName=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ ovs_dir=/var/lib/openvswitch
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + cleanup_ovsdb_server_semaphore
Dec 06 07:22:44 crc kubenswrapper[4954]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 06 07:22:44 crc kubenswrapper[4954]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-xskgs" message=<
Dec 06 07:22:44 crc kubenswrapper[4954]: Exiting ovsdb-server (5) [ OK ]
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 06 07:22:44 crc kubenswrapper[4954]: + source /usr/local/bin/container-scripts/functions
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNBridge=br-int
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNRemote=tcp:localhost:6642
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNEncapType=geneve
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNAvailabilityZones=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ EnableChassisAsGateway=true
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ PhysicalNetworks=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNHostName=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ ovs_dir=/var/lib/openvswitch
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + cleanup_ovsdb_server_semaphore
Dec 06 07:22:44 crc kubenswrapper[4954]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 06 07:22:44 crc kubenswrapper[4954]: >
Dec 06 07:22:44 crc kubenswrapper[4954]: E1206 07:22:44.847823 4954 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 06 07:22:44 crc kubenswrapper[4954]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 06 07:22:44 crc kubenswrapper[4954]: + source /usr/local/bin/container-scripts/functions
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNBridge=br-int
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNRemote=tcp:localhost:6642
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNEncapType=geneve
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNAvailabilityZones=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ EnableChassisAsGateway=true
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ PhysicalNetworks=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ OVNHostName=
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ ovs_dir=/var/lib/openvswitch
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 06 07:22:44 crc kubenswrapper[4954]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + sleep 0.5
Dec 06 07:22:44 crc kubenswrapper[4954]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 06 07:22:44 crc kubenswrapper[4954]: + cleanup_ovsdb_server_semaphore
Dec 06 07:22:44 crc kubenswrapper[4954]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 06 07:22:44 crc kubenswrapper[4954]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 06 07:22:44 crc kubenswrapper[4954]: > pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" containerID="cri-o://2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc"
Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.847865 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" containerID="cri-o://2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" gracePeriod=28
Dec 06 07:22:44 crc kubenswrapper[4954]: I1206 07:22:44.902914 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="galera" containerID="cri-o://eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" gracePeriod=30
Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.079353 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.081984 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerName="nova-scheduler-scheduler" containerID="cri-o://3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" gracePeriod=30
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb304148c_ab0e_42ac_966a_024ff59a8cde.slice/crio-68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13e11c3_b93d_4671_b9a7_961ab83bd23e.slice/crio-conmon-0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda69eb5c6_8a5b_4eb9_ad61_16cc3bec008e.slice/crio-conmon-084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc76a6f0b_0bc0_4f2e_8e03_f7871b960ebe.slice/crio-conmon-6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb304148c_ab0e_42ac_966a_024ff59a8cde.slice/crio-conmon-e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25414b25_a6cc_41e6_8360_3e85f54321d5.slice/crio-conmon-d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc76a6f0b_0bc0_4f2e_8e03_f7871b960ebe.slice/crio-6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13e11c3_b93d_4671_b9a7_961ab83bd23e.slice/crio-0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75e5da0_fd73_485e_b53a_b5e96965bb99.slice/crio-conmon-2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d.scope\": RecentStats: unable to find data in memory cache]" Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.259199 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93 is running failed: container process not found" containerID="03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.262314 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93 is running failed: container process not found" containerID="03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.264674 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93 is running failed: container process not found" containerID="03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93" cmd=["/usr/bin/pidof","ovsdb-server"] 
Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.264717 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="ovsdbserver-sb" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.295029 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.319945 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.336657 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hcgg8_75a7f9b6-6582-4498-bccd-954270bc3f8e/openstack-network-exporter/0.log" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.336758 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.349675 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.359216 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.377148 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.495582 4954 generic.go:334] "Generic (PLEG): container finished" podID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" containerID="084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64" exitCode=137 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.496004 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508145 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508211 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508248 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508277 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508306 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508349 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6qj9\" (UniqueName: \"kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508410 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config\") pod \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: 
\"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508502 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508524 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle\") pod \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508553 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508952 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508971 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.508993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509019 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509050 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25rq9\" (UniqueName: \"kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509071 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmnf9\" (UniqueName: \"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9\") pod \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret\") pod 
\"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\" (UID: \"a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509115 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run\") pod \"092ab1a2-b565-47cd-9b83-f306883b688e\" (UID: \"092ab1a2-b565-47cd-9b83-f306883b688e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir\") pod \"75a7f9b6-6582-4498-bccd-954270bc3f8e\" (UID: \"75a7f9b6-6582-4498-bccd-954270bc3f8e\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.509220 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58kkq\" (UniqueName: \"kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq\") pod \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\" (UID: \"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f\") " Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.515637 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.515665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.516140 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.517402 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config" (OuterVolumeSpecName: "config") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.525616 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts" (OuterVolumeSpecName: "scripts") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.537851 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9" (OuterVolumeSpecName: "kube-api-access-t6qj9") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "kube-api-access-t6qj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.538872 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run" (OuterVolumeSpecName: "var-run") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.538936 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.547662 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="048d9a25-3088-4426-914f-5c4436b1e98a" path="/var/lib/kubelet/pods/048d9a25-3088-4426-914f-5c4436b1e98a/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.550634 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23dd7041-53d4-4d98-bfc9-fc64828c6c7f" path="/var/lib/kubelet/pods/23dd7041-53d4-4d98-bfc9-fc64828c6c7f/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.552030 4954 generic.go:334] "Generic (PLEG): container finished" podID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerID="6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96" exitCode=143 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.555389 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7bda2f-6de7-4e30-be44-adc2196c3510" path="/var/lib/kubelet/pods/3d7bda2f-6de7-4e30-be44-adc2196c3510/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.555980 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8866facf-9bb1-4deb-9244-15805d162346" path="/var/lib/kubelet/pods/8866facf-9bb1-4deb-9244-15805d162346/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.556801 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98f49b7-0577-4cc4-a3cc-bf5392fae7af" path="/var/lib/kubelet/pods/a98f49b7-0577-4cc4-a3cc-bf5392fae7af/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.558242 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfda18f4-e714-4a51-a6fe-9d52cef7605b" path="/var/lib/kubelet/pods/bfda18f4-e714-4a51-a6fe-9d52cef7605b/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.559040 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8cf20f-f0a2-417a-914a-7710fa0fafb9" path="/var/lib/kubelet/pods/dc8cf20f-f0a2-417a-914a-7710fa0fafb9/volumes" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.570738 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9" (OuterVolumeSpecName: "kube-api-access-fmnf9") pod "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" (UID: "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e"). InnerVolumeSpecName "kube-api-access-fmnf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.571522 4954 generic.go:334] "Generic (PLEG): container finished" podID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerID="98c8778c7e8353cc4dc52283f262371684ed883dec692ca28a39a816cbb59cd0" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.573979 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.597428 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.599441 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9" (OuterVolumeSpecName: "kube-api-access-25rq9") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "kube-api-access-25rq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.607218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq" (OuterVolumeSpecName: "kube-api-access-58kkq") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "kube-api-access-58kkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.612954 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613074 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58kkq\" (UniqueName: \"kubernetes.io/projected/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-kube-api-access-58kkq\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613109 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092ab1a2-b565-47cd-9b83-f306883b688e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613141 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75a7f9b6-6582-4498-bccd-954270bc3f8e-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613150 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6qj9\" (UniqueName: \"kubernetes.io/projected/75a7f9b6-6582-4498-bccd-954270bc3f8e-kube-api-access-t6qj9\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613187 4954 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/75a7f9b6-6582-4498-bccd-954270bc3f8e-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613197 4954 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613205 4954 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613215 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25rq9\" (UniqueName: \"kubernetes.io/projected/092ab1a2-b565-47cd-9b83-f306883b688e-kube-api-access-25rq9\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613223 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmnf9\" (UniqueName: \"kubernetes.io/projected/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-kube-api-access-fmnf9\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.613231 4954 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092ab1a2-b565-47cd-9b83-f306883b688e-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.628173 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" (UID: "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.671710 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.671750 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="854d5d618d13ec03ec6c2f4744287e27f89042a1dae6f369b12bbc3408f4ff5c" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.671760 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.683420 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.696946 4954 generic.go:334] "Generic (PLEG): container finished" podID="096c2131-031b-4573-ade7-b1d0d34abc60" containerID="feff0af557a19a07e436a4ff2db6401f9895dc0adfba494469ecdb5c414013e9" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.698768 4954 generic.go:334] "Generic (PLEG): container finished" podID="dc34e079-7411-438c-aca8-b2c95854158e" containerID="31234f4535422fb1bf7bada8c66937069f24dc6cdacdb390f9005698b7491575" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.707931 4954 generic.go:334] "Generic (PLEG): container finished" podID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerID="dde3cc2c35c6b947b115a84354f88ab0c259ca23e0659e70ff1f7de4d80de96e" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.719348 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.739745 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.747676 4954 generic.go:334] "Generic (PLEG): container finished" podID="40b282e6-b847-4393-89f2-7844fce43388" containerID="03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93" exitCode=0 Dec 06 07:22:45 crc kubenswrapper[4954]: W1206 07:22:45.749969 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d774b5_3c0f_4b86_ad33_b6bfd49c1b46.slice/crio-6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a WatchSource:0}: Error finding container 6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a: Status 404 returned error can't find the container with id 6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.762142 4954 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.762436 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="galera" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.803179 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" (UID: "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.829106 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.863747 4954 generic.go:334] "Generic (PLEG): container finished" podID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerID="0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92" exitCode=143 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.867829 4954 generic.go:334] "Generic (PLEG): container finished" podID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerID="d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa" exitCode=143 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.879807 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.883053 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.911700 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" (UID: "a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.913221 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.925015 4954 generic.go:334] "Generic (PLEG): container finished" podID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerID="ac777ba7c81ee56eb4fb435d7da2f478c6a3c4f40024a4daae7126627007674a" exitCode=143 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.945550 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config" (OuterVolumeSpecName: "config") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.947186 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.947216 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.947228 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.947237 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.947247 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.947318 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 06 07:22:45 crc kubenswrapper[4954]: E1206 07:22:45.947377 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data podName:578bec25-a54c-4f52-95f2-19f20f833437 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:49.947358381 +0000 UTC m=+1544.760717770 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data") pod "rabbitmq-server-0" (UID: "578bec25-a54c-4f52-95f2-19f20f833437") : configmap "rabbitmq-config-data" not found Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.970880 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.972851 4954 generic.go:334] "Generic (PLEG): container finished" podID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerID="27d6334f40e13cafc61de087025fab078b3fe41774336c33565c8ae4035e063f" exitCode=143 Dec 06 07:22:45 crc kubenswrapper[4954]: I1206 07:22:45.974496 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.017009 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "092ab1a2-b565-47cd-9b83-f306883b688e" (UID: "092ab1a2-b565-47cd-9b83-f306883b688e"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.019980 4954 generic.go:334] "Generic (PLEG): container finished" podID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerID="2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d" exitCode=143 Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.037831 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" (UID: "56254c3b-cd9d-40d9-bb7e-b1c858f3b87f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043626 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glanced905-account-delete-6dwqh"] Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043719 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerDied","Data":"6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043777 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerDied","Data":"98c8778c7e8353cc4dc52283f262371684ed883dec692ca28a39a816cbb59cd0"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043797 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7","Type":"ContainerDied","Data":"67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043810 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67dfa7eeace0009eed1693c30dbc881ceef8362746b2726ed0ab32cac8b04198" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n" event={"ID":"56254c3b-cd9d-40d9-bb7e-b1c858f3b87f","Type":"ContainerDied","Data":"20f0554ceee87a95a1da8cbc3375427e393f1dcd7c992d70c597d1754d942d58"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043839 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerDied","Data":"2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"854d5d618d13ec03ec6c2f4744287e27f89042a1dae6f369b12bbc3408f4ff5c"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043900 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerDied","Data":"feff0af557a19a07e436a4ff2db6401f9895dc0adfba494469ecdb5c414013e9"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043916 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc34e079-7411-438c-aca8-b2c95854158e","Type":"ContainerDied","Data":"31234f4535422fb1bf7bada8c66937069f24dc6cdacdb390f9005698b7491575"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.043983 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerDied","Data":"dde3cc2c35c6b947b115a84354f88ab0c259ca23e0659e70ff1f7de4d80de96e"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerDied","Data":"03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerDied","Data":"0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerDied","Data":"d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044360 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerDied","Data":"ac777ba7c81ee56eb4fb435d7da2f478c6a3c4f40024a4daae7126627007674a"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerDied","Data":"27d6334f40e13cafc61de087025fab078b3fe41774336c33565c8ae4035e063f"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044406 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerDied","Data":"2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.044430 4954 scope.go:117] "RemoveContainer" containerID="084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.047903 4954 generic.go:334] "Generic (PLEG): container finished" podID="b907c888-706a-4183-b581-ff7b4742fc74" containerID="632cafbc6ab8b4bca12eb7135c2589ae827ea7666ad9a40b63f7adf3a97af9c7" exitCode=143 Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.047982 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerDied","Data":"632cafbc6ab8b4bca12eb7135c2589ae827ea7666ad9a40b63f7adf3a97af9c7"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.053156 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.054906 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.055120 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.055235 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092ab1a2-b565-47cd-9b83-f306883b688e-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.054696 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.057844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lnbn8" event={"ID":"092ab1a2-b565-47cd-9b83-f306883b688e","Type":"ContainerDied","Data":"efbcb8df143034dd4846aa36db22eee233f1f2282bd1aa5a1d965ff7aba400d2"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.057953 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lnbn8" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.082665 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.084079 4954 generic.go:334] "Generic (PLEG): container finished" podID="37a14211-fd70-4578-83c1-d674b2cf6172" containerID="0bb0b24e4d3085b50b85c5df8d24f8ed4c7c0c28e8532d6d7a2d8957c39ce556" exitCode=143 Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.084222 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hcgg8" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.084240 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerDied","Data":"0bb0b24e4d3085b50b85c5df8d24f8ed4c7c0c28e8532d6d7a2d8957c39ce556"} Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.096915 4954 scope.go:117] "RemoveContainer" containerID="084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64" Dec 06 07:22:46 crc kubenswrapper[4954]: E1206 07:22:46.099718 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64\": container with ID starting with 084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64 not found: ID does not exist" containerID="084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.099768 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64"} err="failed to get container status \"084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64\": rpc error: code = NotFound desc = could not find container \"084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64\": container with ID starting with 084e897542d772a7c89dc0e1534dec1824e39ef33265b2908249460a05917c64 not found: ID does not exist" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.099805 4954 scope.go:117] "RemoveContainer" containerID="9d7871a1cc043780cd4a4b6b1e5824bece7fc1fc553c7805c5dd188196365a91" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.104674 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.168485 4954 scope.go:117] "RemoveContainer" containerID="837f7307011dcdff84ddf290fbfe19555e90f8ff13ebde2aef98e541e550a775" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.247272 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "75a7f9b6-6582-4498-bccd-954270bc3f8e" (UID: "75a7f9b6-6582-4498-bccd-954270bc3f8e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.280130 4954 scope.go:117] "RemoveContainer" containerID="fa365fab5db7334bced55d4b6aa2cb11e9adda9a3cc808705301e34527d718db" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.296974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.300448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9h9b\" (UniqueName: \"kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304382 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304477 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle\") pod \"dc34e079-7411-438c-aca8-b2c95854158e\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304530 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304558 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304657 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data\") pod \"dc34e079-7411-438c-aca8-b2c95854158e\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304708 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b9km\" (UniqueName: \"kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304781 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcb2k\" (UniqueName: \"kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k\") pod \"dc34e079-7411-438c-aca8-b2c95854158e\" (UID: \"dc34e079-7411-438c-aca8-b2c95854158e\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304829 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304868 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304891 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304919 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.304969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.305008 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config\") pod \"40b282e6-b847-4393-89f2-7844fce43388\" (UID: \"40b282e6-b847-4393-89f2-7844fce43388\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.305038 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.305058 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.305112 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs\") pod \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\" (UID: \"24a72b28-2cf7-47e0-b7c2-5ff92acedfe7\") " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.311592 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a7f9b6-6582-4498-bccd-954270bc3f8e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.320967 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lnbn8"] Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.302931 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.320351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config" (OuterVolumeSpecName: "config") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.347748 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.348163 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lnbn8"] Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.349688 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: W1206 07:22:46.350241 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod693e40da_d019_421e_83a8_6dc351580607.slice/crio-5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae WatchSource:0}: Error finding container 5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae: Status 404 returned error can't find the container with id 5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.354284 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k" (OuterVolumeSpecName: "kube-api-access-vcb2k") pod "dc34e079-7411-438c-aca8-b2c95854158e" (UID: "dc34e079-7411-438c-aca8-b2c95854158e"). InnerVolumeSpecName "kube-api-access-vcb2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.359864 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config" (OuterVolumeSpecName: "config") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.364754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts" (OuterVolumeSpecName: "scripts") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.384314 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.386418 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts" (OuterVolumeSpecName: "scripts") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.392845 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km" (OuterVolumeSpecName: "kube-api-access-9b9km") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "kube-api-access-9b9km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.414981 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b" (OuterVolumeSpecName: "kube-api-access-v9h9b") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "kube-api-access-v9h9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429070 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40b282e6-b847-4393-89f2-7844fce43388-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429116 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9h9b\" (UniqueName: \"kubernetes.io/projected/40b282e6-b847-4393-89f2-7844fce43388-kube-api-access-v9h9b\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429127 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429136 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429148 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b9km\" (UniqueName: \"kubernetes.io/projected/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-kube-api-access-9b9km\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429157 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcb2k\" (UniqueName: \"kubernetes.io/projected/dc34e079-7411-438c-aca8-b2c95854158e-kube-api-access-vcb2k\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429226 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429241 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429253 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b282e6-b847-4393-89f2-7844fce43388-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429262 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.429281 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.464193 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.464865 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.486852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc34e079-7411-438c-aca8-b2c95854158e" (UID: "dc34e079-7411-438c-aca8-b2c95854158e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:46 crc kubenswrapper[4954]: I1206 07:22:46.513351 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.521634 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74985f58f-cpdl4" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-httpd" containerID="cri-o://cb83e86e00e38d45967cf91f65a342bea85bf6723ea7014271922e9077ba5d0d" gracePeriod=30 Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.522694 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74985f58f-cpdl4" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-server" containerID="cri-o://b55c4714520851c4ac11e12722e037b67a35d180952a2ddf1903aaa5172b8c72" gracePeriod=30 Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.539268 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.539299 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.539309 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.539379 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.539430 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data podName:31452db7-e2c4-4e61-8f8c-7017476f0bc0 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:50.539411195 +0000 UTC m=+1545.352770584 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0") : configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.553475 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.571477 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: W1206 07:22:46.579522 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod097befa0_58fe_4616_bed7_ada4f7d81ce3.slice/crio-05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e WatchSource:0}: Error finding container 05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e: Status 404 returned error can't find the container with id 05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.579614 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "40b282e6-b847-4393-89f2-7844fce43388" (UID: "40b282e6-b847-4393-89f2-7844fce43388"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.587884 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.595890 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderdf0e-account-delete-fsz84"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.645191 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.645607 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.645624 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b282e6-b847-4393-89f2-7844fce43388-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.645635 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.647713 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican664f-account-delete-l67mj"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.651427 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.655789 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.665926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data" (OuterVolumeSpecName: "config-data") pod "dc34e079-7411-438c-aca8-b2c95854158e" (UID: "dc34e079-7411-438c-aca8-b2c95854158e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.675284 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement88ab-account-delete-597b8"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.684292 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.684297 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.684639 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.703528 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.711205 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-6bn5n"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.711533 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.718993 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.727101 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron251d-account-delete-rhsq5"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.727958 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:46.728053 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.738273 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0f048-account-delete-6rspl"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.747838 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.747866 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc34e079-7411-438c-aca8-b2c95854158e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.790239 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" (UID: "24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.850509 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.895931 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.904615 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-hcgg8"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.916781 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi9e6e-account-delete-2shf2"] Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:46.967674 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.055353 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data\") pod \"5e602bfa-4a57-4fe6-adca-90acb13b7458\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.055539 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs\") pod \"5e602bfa-4a57-4fe6-adca-90acb13b7458\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.055582 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwbh7\" (UniqueName: \"kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7\") pod \"5e602bfa-4a57-4fe6-adca-90acb13b7458\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.056269 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle\") pod \"5e602bfa-4a57-4fe6-adca-90acb13b7458\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.056706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs\") pod \"5e602bfa-4a57-4fe6-adca-90acb13b7458\" (UID: \"5e602bfa-4a57-4fe6-adca-90acb13b7458\") " Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.079462 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7" (OuterVolumeSpecName: "kube-api-access-cwbh7") pod "5e602bfa-4a57-4fe6-adca-90acb13b7458" (UID: "5e602bfa-4a57-4fe6-adca-90acb13b7458"). InnerVolumeSpecName "kube-api-access-cwbh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.105242 4954 generic.go:334] "Generic (PLEG): container finished" podID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerID="cb83e86e00e38d45967cf91f65a342bea85bf6723ea7014271922e9077ba5d0d" exitCode=0 Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.105295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerDied","Data":"cb83e86e00e38d45967cf91f65a342bea85bf6723ea7014271922e9077ba5d0d"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.112237 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.114656 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"40b282e6-b847-4393-89f2-7844fce43388","Type":"ContainerDied","Data":"50dd9e0f4748fda949608a54563ba2a818995654b61092a245197da47944b2fe"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.114718 4954 scope.go:117] "RemoveContainer" containerID="e5d78eb2813e941bbf1327182aea6826153731bf9030370520ea6cf536e2c69c" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.121134 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerID="9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9" exitCode=0 Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.121254 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e602bfa-4a57-4fe6-adca-90acb13b7458","Type":"ContainerDied","Data":"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.121319 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e602bfa-4a57-4fe6-adca-90acb13b7458","Type":"ContainerDied","Data":"184f44b211091fd1ce32e67162b351906ae92ae908086377ab1a1fba8b4da095"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.121425 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.131125 4954 generic.go:334] "Generic (PLEG): container finished" podID="e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" containerID="2bc60fc2d15e39cabc0790d5969c57496804901c2638939ec5960ab33712f898" exitCode=0 Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.131189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanced905-account-delete-6dwqh" event={"ID":"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46","Type":"ContainerDied","Data":"2bc60fc2d15e39cabc0790d5969c57496804901c2638939ec5960ab33712f898"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.131220 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanced905-account-delete-6dwqh" event={"ID":"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46","Type":"ContainerStarted","Data":"6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.142841 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron251d-account-delete-rhsq5" event={"ID":"097befa0-58fe-4616-bed7-ada4f7d81ce3","Type":"ContainerStarted","Data":"05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.146761 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0f048-account-delete-6rspl" event={"ID":"81a84a0f-fa51-461c-a281-6b832ad39aa7","Type":"ContainerStarted","Data":"0f8bdd98d0bbb0caf44bd6a3cfe524e18e9944d3d400afe8c088e77c6f8877d8"} Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.162818 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwbh7\" (UniqueName: \"kubernetes.io/projected/5e602bfa-4a57-4fe6-adca-90acb13b7458-kube-api-access-cwbh7\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.193642 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] 
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.205911 4954 generic.go:334] "Generic (PLEG): container finished" podID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerID="37978e335651ac4f0f09719f13e2d04c7acd8c2ac455f49f29118e1f83cda3a9" exitCode=0
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.206050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerDied","Data":"37978e335651ac4f0f09719f13e2d04c7acd8c2ac455f49f29118e1f83cda3a9"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.206937 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.260633 4954 generic.go:334] "Generic (PLEG): container finished" podID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerID="eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" exitCode=0
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.260857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerDied","Data":"eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.265090 4954 scope.go:117] "RemoveContainer" containerID="03d05c358c3170745be24e17518ed67a6f790841fc18537ed370460246d3cb93"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.270076 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdf0e-account-delete-fsz84" event={"ID":"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d","Type":"ContainerStarted","Data":"8d8c0607cbb962e9a1ad4064bc681757b41669b90c01f54112e79fee12ebf663"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.281270 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican664f-account-delete-l67mj" event={"ID":"693e40da-d019-421e-83a8-6dc351580607","Type":"ContainerStarted","Data":"5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.294196 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement88ab-account-delete-597b8" event={"ID":"418e92aa-4713-4a55-b4d5-650587fcb6ca","Type":"ContainerStarted","Data":"3825826067af0a6a8a03f9c385ee0605f6e5979cb5203df2425b0450a4e29f2a"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.321846 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican664f-account-delete-l67mj" podStartSLOduration=5.321818525 podStartE2EDuration="5.321818525s" podCreationTimestamp="2025-12-06 07:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 07:22:47.308067904 +0000 UTC m=+1542.121427283" watchObservedRunningTime="2025-12-06 07:22:47.321818525 +0000 UTC m=+1542.135177914"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.340241 4954 generic.go:334] "Generic (PLEG): container finished" podID="b907c888-706a-4183-b581-ff7b4742fc74" containerID="ce32b7671edf55a676312d4bc38e6dda973ecc26c56be59a3598a590d1eebc14" exitCode=0
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.340307 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerDied","Data":"ce32b7671edf55a676312d4bc38e6dda973ecc26c56be59a3598a590d1eebc14"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.358614 4954 scope.go:117] "RemoveContainer" containerID="9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.362389 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9e6e-account-delete-2shf2" event={"ID":"c417ca1e-22df-4163-96f9-349df3d624e8","Type":"ContainerStarted","Data":"767b3518e5655e87d2bf0eeb5f97fee613b6f2a08de9644c18bdbeb3ba89fb96"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.391964 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc34e079-7411-438c-aca8-b2c95854158e","Type":"ContainerDied","Data":"762b3ad59c7c647a311242e16d7f8285386c8c853d84ff67da08816f97c82278"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.392218 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.414302 4954 generic.go:334] "Generic (PLEG): container finished" podID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerID="e09a4c785be2122538bcc0a01f7e6782d054fe47c518d01aab3ac5049b21cf94" exitCode=0
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.414353 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerDied","Data":"e09a4c785be2122538bcc0a01f7e6782d054fe47c518d01aab3ac5049b21cf94"}
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.414491 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.430662 4954 scope.go:117] "RemoveContainer" containerID="9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"
Dec 06 07:22:47 crc kubenswrapper[4954]: E1206 07:22:47.446093 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9\": container with ID starting with 9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9 not found: ID does not exist" containerID="9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.446163 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9"} err="failed to get container status \"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9\": rpc error: code = NotFound desc = could not find container \"9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9\": container with ID starting with 9b66120864df3196219880eda3f954ff63b12cccdf4e304f76c83d176d3e19b9 not found: ID does not exist"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.446200 4954 scope.go:117] "RemoveContainer" containerID="31234f4535422fb1bf7bada8c66937069f24dc6cdacdb390f9005698b7491575"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.475771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data" (OuterVolumeSpecName: "config-data") pod "5e602bfa-4a57-4fe6-adca-90acb13b7458" (UID: "5e602bfa-4a57-4fe6-adca-90acb13b7458"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.479505 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.486350 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e602bfa-4a57-4fe6-adca-90acb13b7458" (UID: "5e602bfa-4a57-4fe6-adca-90acb13b7458"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.490803 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" path="/var/lib/kubelet/pods/092ab1a2-b565-47cd-9b83-f306883b688e/volumes"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.492329 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b282e6-b847-4393-89f2-7844fce43388" path="/var/lib/kubelet/pods/40b282e6-b847-4393-89f2-7844fce43388/volumes"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.493513 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" path="/var/lib/kubelet/pods/56254c3b-cd9d-40d9-bb7e-b1c858f3b87f/volumes"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.494335 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a7f9b6-6582-4498-bccd-954270bc3f8e" path="/var/lib/kubelet/pods/75a7f9b6-6582-4498-bccd-954270bc3f8e/volumes"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.496194 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e" path="/var/lib/kubelet/pods/a69eb5c6-8a5b-4eb9-ad61-16cc3bec008e/volumes"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.588819 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.633250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "5e602bfa-4a57-4fe6-adca-90acb13b7458" (UID: "5e602bfa-4a57-4fe6-adca-90acb13b7458"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.693089 4954 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.716066 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "5e602bfa-4a57-4fe6-adca-90acb13b7458" (UID: "5e602bfa-4a57-4fe6-adca-90acb13b7458"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.794893 4954 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e602bfa-4a57-4fe6-adca-90acb13b7458-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.881643 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b9cfd76-jp2pl" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:58862->10.217.0.165:9311: read: connection reset by peer"
Dec 06 07:22:47 crc kubenswrapper[4954]: I1206 07:22:47.881659 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759b9cfd76-jp2pl" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:58852->10.217.0.165:9311: read: connection reset by peer"
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.283614 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.293082 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.301881 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.301976 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" containerName="nova-cell0-conductor-conductor"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.354772 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.354835 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.354851 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.354889 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.355787 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerName="kube-state-metrics" containerID="cri-o://ff4a1927bb7fd5e4e0bc2af413f92d09c51402a69dd5aa94a639ab5dbab627ea" gracePeriod=30
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.356190 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-central-agent" containerID="cri-o://0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2" gracePeriod=30
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.356551 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="proxy-httpd" containerID="cri-o://872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271" gracePeriod=30
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.356616 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-notification-agent" containerID="cri-o://87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812" gracePeriod=30
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.356856 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="sg-core" containerID="cri-o://ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5" gracePeriod=30
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.402115 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.507394 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk" event={"ID":"f13e11c3-b93d-4671-b9a7-961ab83bd23e","Type":"ContainerDied","Data":"f6db7f340df6fbda501087af9dab6a6a1d0f26bd3fdf5cd8bb0e29db47e1856b"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.507483 4954 scope.go:117] "RemoveContainer" containerID="37978e335651ac4f0f09719f13e2d04c7acd8c2ac455f49f29118e1f83cda3a9"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.507634 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.514768 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.517852 4954 generic.go:334] "Generic (PLEG): container finished" podID="37a14211-fd70-4578-83c1-d674b2cf6172" containerID="dff634325c63563a5768825e16323ab3d990f36bc7c9fcb6773bf50f64fbe138" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.517937 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerDied","Data":"dff634325c63563a5768825e16323ab3d990f36bc7c9fcb6773bf50f64fbe138"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.556999 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs\") pod \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") "
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.557740 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle\") pod \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") "
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.563191 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data\") pod \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") "
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.563233 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmcsq\" (UniqueName: \"kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq\") pod \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") "
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.587966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom\") pod \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\" (UID: \"f13e11c3-b93d-4671-b9a7-961ab83bd23e\") "
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.605730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs" (OuterVolumeSpecName: "logs") pod "f13e11c3-b93d-4671-b9a7-961ab83bd23e" (UID: "f13e11c3-b93d-4671-b9a7-961ab83bd23e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.610831 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f13e11c3-b93d-4671-b9a7-961ab83bd23e" (UID: "f13e11c3-b93d-4671-b9a7-961ab83bd23e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.613958 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.619806 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq" (OuterVolumeSpecName: "kube-api-access-mmcsq") pod "f13e11c3-b93d-4671-b9a7-961ab83bd23e" (UID: "f13e11c3-b93d-4671-b9a7-961ab83bd23e"). InnerVolumeSpecName "kube-api-access-mmcsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.621344 4954 generic.go:334] "Generic (PLEG): container finished" podID="097befa0-58fe-4616-bed7-ada4f7d81ce3" containerID="85abce05f2197fc99761b6476531dcf92d1318db569f7b035a42686be23aca06" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.621424 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron251d-account-delete-rhsq5" event={"ID":"097befa0-58fe-4616-bed7-ada4f7d81ce3","Type":"ContainerDied","Data":"85abce05f2197fc99761b6476531dcf92d1318db569f7b035a42686be23aca06"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.624577 4954 generic.go:334] "Generic (PLEG): container finished" podID="693e40da-d019-421e-83a8-6dc351580607" containerID="7165148211c75cd9f81f468e8ae42da7229964e346f5de7e874bb757593ba1f9" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.624625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican664f-account-delete-l67mj" event={"ID":"693e40da-d019-421e-83a8-6dc351580607","Type":"ContainerDied","Data":"7165148211c75cd9f81f468e8ae42da7229964e346f5de7e874bb757593ba1f9"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.625730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f13e11c3-b93d-4671-b9a7-961ab83bd23e" (UID: "f13e11c3-b93d-4671-b9a7-961ab83bd23e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.629418 4954 generic.go:334] "Generic (PLEG): container finished" podID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerID="6ab0f45e984c650aee9fc279116d7bc9e68e5fe5d6b74cb802fbb810f6d5de26" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.629475 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerDied","Data":"6ab0f45e984c650aee9fc279116d7bc9e68e5fe5d6b74cb802fbb810f6d5de26"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.663015 4954 generic.go:334] "Generic (PLEG): container finished" podID="c417ca1e-22df-4163-96f9-349df3d624e8" containerID="bbc38d953a02dab3938d603ccd3c213f1803459204492241ede64bcaeec6e808" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.663112 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9e6e-account-delete-2shf2" event={"ID":"c417ca1e-22df-4163-96f9-349df3d624e8","Type":"ContainerDied","Data":"bbc38d953a02dab3938d603ccd3c213f1803459204492241ede64bcaeec6e808"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.670085 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": dial tcp 10.217.0.202:3000: connect: connection refused"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.687712 4954 generic.go:334] "Generic (PLEG): container finished" podID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerID="01b096b10176462db3110ae80702f8ad1ade5c76a6f11bd639c695de9a47e99c" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.687788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerDied","Data":"01b096b10176462db3110ae80702f8ad1ade5c76a6f11bd639c695de9a47e99c"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.703326 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmcsq\" (UniqueName: \"kubernetes.io/projected/f13e11c3-b93d-4671-b9a7-961ab83bd23e-kube-api-access-mmcsq\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.703352 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.703374 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13e11c3-b93d-4671-b9a7-961ab83bd23e-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.703382 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.727211 4954 generic.go:334] "Generic (PLEG): container finished" podID="d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" containerID="47d681a97ad53060d226bf1814a05e625af235113bd664a53739e0decbc48a7d" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.727309 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdf0e-account-delete-fsz84" event={"ID":"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d","Type":"ContainerDied","Data":"47d681a97ad53060d226bf1814a05e625af235113bd664a53739e0decbc48a7d"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.759767 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data" (OuterVolumeSpecName: "config-data") pod "f13e11c3-b93d-4671-b9a7-961ab83bd23e" (UID: "f13e11c3-b93d-4671-b9a7-961ab83bd23e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.773883 4954 generic.go:334] "Generic (PLEG): container finished" podID="81a84a0f-fa51-461c-a281-6b832ad39aa7" containerID="3d7db18b93f40985771a850dfbb60c46bb605a815e84de1215841e3e8caefc04" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.773993 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0f048-account-delete-6rspl" event={"ID":"81a84a0f-fa51-461c-a281-6b832ad39aa7","Type":"ContainerDied","Data":"3d7db18b93f40985771a850dfbb60c46bb605a815e84de1215841e3e8caefc04"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.785899 4954 generic.go:334] "Generic (PLEG): container finished" podID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerID="d14ca933ddb6a12f4399c39d7a1279cb6dd7f6ad978226ac72fa861be2573071" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.785918 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerDied","Data":"d14ca933ddb6a12f4399c39d7a1279cb6dd7f6ad978226ac72fa861be2573071"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.807951 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13e11c3-b93d-4671-b9a7-961ab83bd23e-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.823411 4954 generic.go:334] "Generic (PLEG): container finished" podID="418e92aa-4713-4a55-b4d5-650587fcb6ca" containerID="ea6bce84684f9d9646ab01dc7bd2553bf2cde9261c87d0499c52ccf5074334f4" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.824916 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement88ab-account-delete-597b8" event={"ID":"418e92aa-4713-4a55-b4d5-650587fcb6ca","Type":"ContainerDied","Data":"ea6bce84684f9d9646ab01dc7bd2553bf2cde9261c87d0499c52ccf5074334f4"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.854100 4954 generic.go:334] "Generic (PLEG): container finished" podID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerID="5c5b57148f52d4e8a3969fbcc3484fc2918682cc67900402ffb917e03848f564" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.854163 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerDied","Data":"5c5b57148f52d4e8a3969fbcc3484fc2918682cc67900402ffb917e03848f564"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.873708 4954 generic.go:334] "Generic (PLEG): container finished" podID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerID="b55c4714520851c4ac11e12722e037b67a35d180952a2ddf1903aaa5172b8c72" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.876799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerDied","Data":"b55c4714520851c4ac11e12722e037b67a35d180952a2ddf1903aaa5172b8c72"}
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.887167 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.171:8081/readyz\": dial tcp 10.217.0.171:8081: connect: connection refused"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.890173 4954 generic.go:334] "Generic (PLEG): container finished" podID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerID="6880fda36024a9648a750eeda330087a75c544818aceab1279c1dd4c9af76cd2" exitCode=0
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.890407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerDied","Data":"6880fda36024a9648a750eeda330087a75c544818aceab1279c1dd4c9af76cd2"}
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.900767 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f is running failed: container process not found" containerID="3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.910165 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f is running failed: container process not found" containerID="3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.912897 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f is running failed: container process not found" containerID="3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.912976 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerName="nova-scheduler-scheduler"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.951839 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.966641 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.968738 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.972494 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 06 07:22:48 crc kubenswrapper[4954]: E1206 07:22:48.972548 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.980088 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.988888 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Dec 06 07:22:48 crc kubenswrapper[4954]: I1206 07:22:48.989125 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerName="memcached" containerID="cri-o://d27d5d9f536cc22c14abfb14b6f38b3f0d61e96cf9b23ee50d20f02267f9999f" gracePeriod=30
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.019238 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"]
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.019792 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.019872 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.019897 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") "
Dec 06 07:22:49 crc
kubenswrapper[4954]: I1206 07:22:49.030782 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.057005 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7d7bb8bff8-tdqdk"] Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.059864 4954 scope.go:117] "RemoveContainer" containerID="0633e2818de57c77f8cbe3341871b6c9295418f75eb7371e9ba35f08bc900d92" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.067955 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.068704 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.111648 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.120940 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzg28\" (UniqueName: \"kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121057 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121096 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121129 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121187 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121215 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wkrf\" (UniqueName: \"kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: 
I1206 07:22:49.121254 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121288 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121316 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121333 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121386 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config\") pod \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\" (UID: \"f9eaecc6-8e85-432e-a906-3fcecee9fc1d\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121420 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121453 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs\") pod \"37a14211-fd70-4578-83c1-d674b2cf6172\" (UID: \"37a14211-fd70-4578-83c1-d674b2cf6172\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.121840 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.125057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs" (OuterVolumeSpecName: "logs") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.126986 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.133755 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.143201 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.150507 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.153842 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.170865 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wcw8k"] Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.208574 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts" (OuterVolumeSpecName: "scripts") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.208669 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystoneb67a-account-delete-lc9v4"] Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.208953 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf" (OuterVolumeSpecName: "kube-api-access-2wkrf") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "kube-api-access-2wkrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210039 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="mysql-bootstrap" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210116 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="mysql-bootstrap" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210177 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a7f9b6-6582-4498-bccd-954270bc3f8e" containerName="openstack-network-exporter" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210224 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a7f9b6-6582-4498-bccd-954270bc3f8e" containerName="openstack-network-exporter" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210274 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="openstack-network-exporter" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210322 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="openstack-network-exporter" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210389 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-server" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210437 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-server" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210489 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210634 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210697 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="ovsdbserver-sb" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210744 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="ovsdbserver-sb" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210800 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210853 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210901 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc34e079-7411-438c-aca8-b2c95854158e" containerName="nova-cell1-conductor-conductor" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.210948 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc34e079-7411-438c-aca8-b2c95854158e" containerName="nova-cell1-conductor-conductor" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.210999 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-log" 
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211043 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.211164 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211212 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.211260 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211305 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.211359 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="init"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211404 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="init"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.211471 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211523 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.211591 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.211798 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.212011 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.217758 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.217944 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218025 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218083 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="dnsmasq-dns"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218130 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="dnsmasq-dns"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218184 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218234 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218283 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="galera"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218332 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="galera"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218392 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218438 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218484 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218528 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218622 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218674 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218740 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="ovsdbserver-nb"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.218787 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="ovsdbserver-nb"
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.218841 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="openstack-network-exporter"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.219718 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="openstack-network-exporter"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.220189 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.220260 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.220322 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.220373 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="ovsdbserver-nb"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.221610 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.221759 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerName="nova-cell1-novncproxy-novncproxy"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.221874 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-server"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.221956 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222044 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="ovsdbserver-sb"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222097 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" containerName="barbican-keystone-listener-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222144 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" containerName="proxy-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222190 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" containerName="openstack-network-exporter"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222239 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a7f9b6-6582-4498-bccd-954270bc3f8e" containerName="openstack-network-exporter"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222289 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" containerName="glance-httpd"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222346 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-api"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222396 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" containerName="glance-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222447 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" containerName="nova-api-log"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222494 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc34e079-7411-438c-aca8-b2c95854158e" containerName="nova-cell1-conductor-conductor"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222541 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="56254c3b-cd9d-40d9-bb7e-b1c858f3b87f" containerName="dnsmasq-dns"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222603 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ab1a2-b565-47cd-9b83-f306883b688e" containerName="ovn-controller"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222665 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" containerName="galera"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.222725 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b282e6-b847-4393-89f2-7844fce43388" containerName="openstack-network-exporter"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.226995 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wcw8k"]
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.227202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystoneb67a-account-delete-lc9v4"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.235473 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.243864 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.243983 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.244055 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.246857 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.247210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.247383 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.247711 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.248025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.248119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.248213 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9jw\" (UniqueName: \"kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.248359 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.248803 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249149 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249339 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249366 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46m7r\" (UniqueName: \"kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249411 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs\") pod \"79dc6de3-cf27-4c1d-91c5-f922acd48400\" (UID: \"79dc6de3-cf27-4c1d-91c5-f922acd48400\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249440 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw8mq\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249495 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249521 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle\") pod \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\" (UID: \"069bb9c2-be47-445b-ba0e-f32a7db0b96e\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249552 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhspz\" (UniqueName: \"kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249609 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249643 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.249674 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs\") pod \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\" (UID: \"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.245952 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs" (OuterVolumeSpecName: "logs") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.252699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.252835 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.255251 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.255383 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts" (OuterVolumeSpecName: "scripts") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.258505 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.258866 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bf7788f9-vw5rh"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.262388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs" (OuterVolumeSpecName: "logs") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268381 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq" (OuterVolumeSpecName: "kube-api-access-nw8mq") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "kube-api-access-nw8mq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268855 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268874 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-config-data-default\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268886 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268896 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268909 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wkrf\" (UniqueName: \"kubernetes.io/projected/37a14211-fd70-4578-83c1-d674b2cf6172-kube-api-access-2wkrf\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268919 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268927 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268936 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw8mq\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-kube-api-access-nw8mq\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268944 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a14211-fd70-4578-83c1-d674b2cf6172-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268954 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268962 4954 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268986 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.268995 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.269006 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79dc6de3-cf27-4c1d-91c5-f922acd48400-logs\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.271015 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.273946 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28" (OuterVolumeSpecName: "kube-api-access-jzg28") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "kube-api-access-jzg28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.275126 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55bf8ff4-7brlv"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.282909 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.285863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.285924 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw" (OuterVolumeSpecName: "kube-api-access-xt9jw") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "kube-api-access-xt9jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.299717 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.300763 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4vvr4"]
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.316938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r" (OuterVolumeSpecName: "kube-api-access-46m7r") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "kube-api-access-46m7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.326703 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.326928 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz" (OuterVolumeSpecName: "kube-api-access-mhspz") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "kube-api-access-mhspz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.328277 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.346527 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371063 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371300 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle\") pod \"b907c888-706a-4183-b581-ff7b4742fc74\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs\") pod \"b907c888-706a-4183-b581-ff7b4742fc74\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371504 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom\") pod \"b907c888-706a-4183-b581-ff7b4742fc74\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371617 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371699 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.371899 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372022 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvdnt\" (UniqueName: \"kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372259 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372358 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372424 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372486 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4jb8\" (UniqueName: \"kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8\") pod \"b907c888-706a-4183-b581-ff7b4742fc74\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372553 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data\") pod \"b907c888-706a-4183-b581-ff7b4742fc74\" (UID: \"b907c888-706a-4183-b581-ff7b4742fc74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372638 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts\") pod \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\" (UID: \"f84d04d2-6282-4a9c-89a8-3aa64ef22c74\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.372709 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.374395 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs\") pod \"c75e5da0-fd73-485e-b53a-b5e96965bb99\" (UID: \"c75e5da0-fd73-485e-b53a-b5e96965bb99\") "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.374835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.374954 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375088 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhspz\" (UniqueName: \"kubernetes.io/projected/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-kube-api-access-mhspz\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs" (OuterVolumeSpecName: "logs") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375144 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375289 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375308 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzg28\" (UniqueName: \"kubernetes.io/projected/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-kube-api-access-jzg28\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375390 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375410 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375451 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375464 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9jw\" (UniqueName: \"kubernetes.io/projected/79dc6de3-cf27-4c1d-91c5-f922acd48400-kube-api-access-xt9jw\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375475 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/069bb9c2-be47-445b-ba0e-f32a7db0b96e-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375486 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46m7r\" (UniqueName: \"kubernetes.io/projected/c75e5da0-fd73-485e-b53a-b5e96965bb99-kube-api-access-46m7r\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.375523 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/069bb9c2-be47-445b-ba0e-f32a7db0b96e-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.377910 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759b9cfd76-jp2pl"
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.380770 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs" (OuterVolumeSpecName: "logs") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.381854 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs" (OuterVolumeSpecName: "logs") pod "b907c888-706a-4183-b581-ff7b4742fc74" (UID: "b907c888-706a-4183-b581-ff7b4742fc74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.382710 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.387760 4954 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.387844 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:49.887817729 +0000 UTC m=+1544.701177118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : configmap "openstack-scripts" not found
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.392204 4954 projected.go:194] Error preparing data for projected volume kube-api-access-44jth for pod openstack/keystoneb67a-account-delete-lc9v4: failed to fetch token: serviceaccounts "galera-openstack" not found
Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.392288 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:49.892263919 +0000 UTC m=+1544.705623308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-44jth" (UniqueName: "kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : failed to fetch token: serviceaccounts "galera-openstack" not found
Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.414310 4954 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.423390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt" (OuterVolumeSpecName: "kube-api-access-fvdnt") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "kube-api-access-fvdnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.423928 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8" (OuterVolumeSpecName: "kube-api-access-n4jb8") pod "b907c888-706a-4183-b581-ff7b4742fc74" (UID: "b907c888-706a-4183-b581-ff7b4742fc74"). InnerVolumeSpecName "kube-api-access-n4jb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.425271 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts" (OuterVolumeSpecName: "scripts") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.429118 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4vvr4"] Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.445145 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.446337 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.455977 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b907c888-706a-4183-b581-ff7b4742fc74" (UID: "b907c888-706a-4183-b581-ff7b4742fc74"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.458799 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4374b249-a0ad-4b36-9924-fa871e0e63fa" path="/var/lib/kubelet/pods/4374b249-a0ad-4b36-9924-fa871e0e63fa/volumes" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.460555 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" path="/var/lib/kubelet/pods/5e602bfa-4a57-4fe6-adca-90acb13b7458/volumes" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.461325 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc34e079-7411-438c-aca8-b2c95854158e" path="/var/lib/kubelet/pods/dc34e079-7411-438c-aca8-b2c95854158e/volumes" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.462632 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13e11c3-b93d-4671-b9a7-961ab83bd23e" path="/var/lib/kubelet/pods/f13e11c3-b93d-4671-b9a7-961ab83bd23e/volumes" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.463415 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3" path="/var/lib/kubelet/pods/fb562ee5-7fe2-4a1e-b412-1e4ac705c8a3/volumes" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.476680 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs\") pod \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478291 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs\") pod \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478463 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data\") pod \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478714 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.478965 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jdcp8\" (UniqueName: \"kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8\") pod \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.479200 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.479512 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5jfl\" (UniqueName: \"kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.479735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.479833 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs\") pod \"25414b25-a6cc-41e6-8360-3e85f54321d5\" (UID: \"25414b25-a6cc-41e6-8360-3e85f54321d5\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.486340 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle\") pod \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\" (UID: \"432b9d93-b045-4e25-b58b-b3a6fd8512c4\") " Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.487738 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b907c888-706a-4183-b581-ff7b4742fc74-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.487818 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.487874 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.490534 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c75e5da0-fd73-485e-b53a-b5e96965bb99-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.491736 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvdnt\" (UniqueName: \"kubernetes.io/projected/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-kube-api-access-fvdnt\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.491782 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4jb8\" (UniqueName: 
\"kubernetes.io/projected/b907c888-706a-4183-b581-ff7b4742fc74-kube-api-access-n4jb8\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.491797 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75e5da0-fd73-485e-b53a-b5e96965bb99-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.491811 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.479632 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs" (OuterVolumeSpecName: "logs") pod "432b9d93-b045-4e25-b58b-b3a6fd8512c4" (UID: "432b9d93-b045-4e25-b58b-b3a6fd8512c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.482816 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts" (OuterVolumeSpecName: "scripts") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.483435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs" (OuterVolumeSpecName: "logs") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.484624 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8" (OuterVolumeSpecName: "kube-api-access-jdcp8") pod "432b9d93-b045-4e25-b58b-b3a6fd8512c4" (UID: "432b9d93-b045-4e25-b58b-b3a6fd8512c4"). InnerVolumeSpecName "kube-api-access-jdcp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.515593 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.529940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.563757 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.563898 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl" (OuterVolumeSpecName: "kube-api-access-h5jfl") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "kube-api-access-h5jfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.575768 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data" (OuterVolumeSpecName: "config-data") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.590460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600232 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600275 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25414b25-a6cc-41e6-8360-3e85f54321d5-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600288 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600336 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600351 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdcp8\" (UniqueName: \"kubernetes.io/projected/432b9d93-b045-4e25-b58b-b3a6fd8512c4-kube-api-access-jdcp8\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600365 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600380 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600393 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5jfl\" (UniqueName: \"kubernetes.io/projected/25414b25-a6cc-41e6-8360-3e85f54321d5-kube-api-access-h5jfl\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc 
kubenswrapper[4954]: I1206 07:22:49.600404 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.600413 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432b9d93-b045-4e25-b58b-b3a6fd8512c4-logs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.670460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.671107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.672469 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "432b9d93-b045-4e25-b58b-b3a6fd8512c4" (UID: "432b9d93-b045-4e25-b58b-b3a6fd8512c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.702262 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.702652 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.702761 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.707422 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b907c888-706a-4183-b581-ff7b4742fc74" (UID: "b907c888-706a-4183-b581-ff7b4742fc74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.711546 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "combined-ca-bundle". 
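
UnmountDevice (operation_generator.go:917, seen above completing for local-storage06-crc, with local-storage04-crc still pending) is a second, node-scoped stage that runs only after every pod-level TearDown for the volume has finished: it removes the device-global mount that attachable and local volumes share between pods. At bottom that final step amounts to an unmount(2) of the global mount path; a hedged illustration with golang.org/x/sys/unix (the target path below is invented, since the log does not print it):

    package main

    import (
        "fmt"

        "golang.org/x/sys/unix"
    )

    func main() {
        // Hypothetical device-global mount point for a local PV; the real
        // path is derived by the volume plugin and is not shown in the log.
        target := "/mnt/local-storage06-crc"
        if err := unix.Unmount(target, 0); err != nil {
            fmt.Println("unmount failed:", err)
        } else {
            fmt.Println("UnmountDevice succeeded for", target)
        }
    }
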
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.717154 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data" (OuterVolumeSpecName: "config-data") pod "37a14211-fd70-4578-83c1-d674b2cf6172" (UID: "37a14211-fd70-4578-83c1-d674b2cf6172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.753950 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.778680 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.790601 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804740 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a14211-fd70-4578-83c1-d674b2cf6172-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804773 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804789 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804804 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804817 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.804830 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.808355 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.840608 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "432b9d93-b045-4e25-b58b-b3a6fd8512c4" (UID: "432b9d93-b045-4e25-b58b-b3a6fd8512c4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.878889 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data" (OuterVolumeSpecName: "config-data") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.888463 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data" (OuterVolumeSpecName: "config-data") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.892109 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data" (OuterVolumeSpecName: "config-data") pod "b907c888-706a-4183-b581-ff7b4742fc74" (UID: "b907c888-706a-4183-b581-ff7b4742fc74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.907210 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.907372 4954 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.907523 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:50.907445683 +0000 UTC m=+1545.720805072 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : configmap "openstack-scripts" not found Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.907876 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.907980 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.907993 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.908002 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.908013 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.908023 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b907c888-706a-4183-b581-ff7b4742fc74-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.908112 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.914856 4954 projected.go:194] Error preparing data for projected volume kube-api-access-44jth for pod openstack/keystoneb67a-account-delete-lc9v4: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:49 crc kubenswrapper[4954]: E1206 07:22:49.914948 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:50.914911364 +0000 UTC m=+1545.728270753 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-44jth" (UniqueName: "kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.916193 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.920762 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.923747 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.929887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.929944 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74985f58f-cpdl4" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.931525 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data" (OuterVolumeSpecName: "config-data") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.932077 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f6b0598-f49f-4300-a2e3-edb512001517" containerID="872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271" exitCode=0 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.932105 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f6b0598-f49f-4300-a2e3-edb512001517" containerID="ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5" exitCode=2 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.932115 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f6b0598-f49f-4300-a2e3-edb512001517" containerID="0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2" exitCode=0 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.933761 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759b9cfd76-jp2pl" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.948714 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.954037 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.955920 4954 generic.go:334] "Generic (PLEG): container finished" podID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerID="ff4a1927bb7fd5e4e0bc2af413f92d09c51402a69dd5aa94a639ab5dbab627ea" exitCode=2 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.962917 4954 generic.go:334] "Generic (PLEG): container finished" podID="ca2306ed-dcff-4143-a452-9a209d0a46a1" containerID="358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" exitCode=0 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.966324 4954 generic.go:334] "Generic (PLEG): container finished" podID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerID="3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" exitCode=0 Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.972790 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55bf8ff4-7brlv" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.982515 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bf7788f9-vw5rh" Dec 06 07:22:49 crc kubenswrapper[4954]: I1206 07:22:49.983520 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data" (OuterVolumeSpecName: "config-data") pod "432b9d93-b045-4e25-b58b-b3a6fd8512c4" (UID: "432b9d93-b045-4e25-b58b-b3a6fd8512c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:49.999988 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f9eaecc6-8e85-432e-a906-3fcecee9fc1d" (UID: "f9eaecc6-8e85-432e-a906-3fcecee9fc1d"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.005057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data" (OuterVolumeSpecName: "config-data") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.011105 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/432b9d93-b045-4e25-b58b-b3a6fd8512c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.011145 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.011154 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.011165 4954 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eaecc6-8e85-432e-a906-3fcecee9fc1d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.011175 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.011263 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.011712 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data podName:578bec25-a54c-4f52-95f2-19f20f833437 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:58.01130592 +0000 UTC m=+1552.824665309 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data") pod "rabbitmq-server-0" (UID: "578bec25-a54c-4f52-95f2-19f20f833437") : configmap "rabbitmq-config-data" not found Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.029529 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.036250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.052580 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "069bb9c2-be47-445b-ba0e-f32a7db0b96e" (UID: "069bb9c2-be47-445b-ba0e-f32a7db0b96e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.070702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.074867 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" (UID: "c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.080798 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "79dc6de3-cf27-4c1d-91c5-f922acd48400" (UID: "79dc6de3-cf27-4c1d-91c5-f922acd48400"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.098361 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data" (OuterVolumeSpecName: "config-data") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112797 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112830 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112838 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112859 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112868 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/069bb9c2-be47-445b-ba0e-f32a7db0b96e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112878 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.112889 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79dc6de3-cf27-4c1d-91c5-f922acd48400-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.119046 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "25414b25-a6cc-41e6-8360-3e85f54321d5" (UID: "25414b25-a6cc-41e6-8360-3e85f54321d5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.125973 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c75e5da0-fd73-485e-b53a-b5e96965bb99" (UID: "c75e5da0-fd73-485e-b53a-b5e96965bb99"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150259 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystoneb67a-account-delete-lc9v4"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150314 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150336 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe","Type":"ContainerDied","Data":"b46e78bf1848a06a70c7514940675caa666a41fb787968055e444933e52c136b"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c75e5da0-fd73-485e-b53a-b5e96965bb99","Type":"ContainerDied","Data":"098992b0ef35dfec1bdd6242548e49ee9144814dd96d3bf5c510386e7ce75337"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150393 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eaecc6-8e85-432e-a906-3fcecee9fc1d","Type":"ContainerDied","Data":"8cce58a0ff75d9343d7f22f57084ad11c9d91175d3126e4a853a96e0b5431b67"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"432b9d93-b045-4e25-b58b-b3a6fd8512c4","Type":"ContainerDied","Data":"a2f1467030e2ed182f134941e6c80e1c7330809d7284e0dab90842fce7458b87"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150428 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74985f58f-cpdl4" event={"ID":"069bb9c2-be47-445b-ba0e-f32a7db0b96e","Type":"ContainerDied","Data":"51421937a8141e0b37995b5aad66e605467b9e0ff73182c1165ee1cc868635ca"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerDied","Data":"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150576 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerDied","Data":"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerDied","Data":"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150606 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759b9cfd76-jp2pl" event={"ID":"25414b25-a6cc-41e6-8360-3e85f54321d5","Type":"ContainerDied","Data":"086efa941c2782c799a58103f81050d1c4c077adbeba240ce660ff290dcc32e1"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150625 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150644 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-w9lhq"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150660 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-w9lhq"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150678 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystoneb67a-account-delete-lc9v4"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150697 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37a14211-fd70-4578-83c1-d674b2cf6172","Type":"ContainerDied","Data":"5891bcf5b4323836ecdec4e35150830c0d731502463d8f7b5a584e351eeca4b5"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79dc6de3-cf27-4c1d-91c5-f922acd48400","Type":"ContainerDied","Data":"ce1cdbb2c6fc2625fbd0e04cdb4ed598c45460bf2f09b1ec8999150c3f5d1660"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150729 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b67a-account-create-update-pbmk9"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150747 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b67a-account-create-update-pbmk9"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150780 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"938d0b4f-21d2-4972-8436-eb1fbd6db5bc","Type":"ContainerDied","Data":"ff4a1927bb7fd5e4e0bc2af413f92d09c51402a69dd5aa94a639ab5dbab627ea"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150796 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"938d0b4f-21d2-4972-8436-eb1fbd6db5bc","Type":"ContainerDied","Data":"766cff3da05ce40fb43cec977ccc48f30c62227c4141bf538b1adef92021baef"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150809 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="766cff3da05ce40fb43cec977ccc48f30c62227c4141bf538b1adef92021baef" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150820 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca2306ed-dcff-4143-a452-9a209d0a46a1","Type":"ContainerDied","Data":"358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150853 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1","Type":"ContainerDied","Data":"3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55bf8ff4-7brlv" event={"ID":"f84d04d2-6282-4a9c-89a8-3aa64ef22c74","Type":"ContainerDied","Data":"c0bf167dc2e4d497caa12502cdabb27efe38c789d6566ae953f0ddb3be6b425e"} Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.150903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bf7788f9-vw5rh" event={"ID":"b907c888-706a-4183-b581-ff7b4742fc74","Type":"ContainerDied","Data":"fdf17bd0dc1ae4c72061ac5a70e2ecda5d81bd81ba6b0830f0afb75b6ad277b4"} Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.151371 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-44jth operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystoneb67a-account-delete-lc9v4" 
podUID="2a9a9a20-3eb3-4fbe-9bab-18f69f01399f" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.151422 4954 scope.go:117] "RemoveContainer" containerID="5c5b57148f52d4e8a3969fbcc3484fc2918682cc67900402ffb917e03848f564" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.151773 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-55864b7b7d-dsx9g" podUID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" containerName="keystone-api" containerID="cri-o://cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7" gracePeriod=30 Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.162588 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.175019 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f84d04d2-6282-4a9c-89a8-3aa64ef22c74" (UID: "f84d04d2-6282-4a9c-89a8-3aa64ef22c74"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.178114 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.203631 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.205772 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.214503 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.216465 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25414b25-a6cc-41e6-8360-3e85f54321d5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.216479 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c75e5da0-fd73-485e-b53a-b5e96965bb99-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.216489 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.216499 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f84d04d2-6282-4a9c-89a8-3aa64ef22c74-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.222114 4954 scope.go:117] "RemoveContainer" containerID="6f34aece6d2e9c27b3779a555391f111659dcf20da766d7854e90f5cfe888d96" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.227410 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.234023 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.283816 4954 scope.go:117] "RemoveContainer" containerID="6880fda36024a9648a750eeda330087a75c544818aceab1279c1dd4c9af76cd2" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.303374 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmt8w\" (UniqueName: \"kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w\") pod \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320701 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data\") pod \"ca2306ed-dcff-4143-a452-9a209d0a46a1\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320790 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle\") pod \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320846 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-975mx\" (UniqueName: \"kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx\") pod \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data\") pod \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\" (UID: \"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.320954 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts\") pod \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\" (UID: \"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321077 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle\") pod \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321126 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle\") pod \"ca2306ed-dcff-4143-a452-9a209d0a46a1\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321189 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs\") pod \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321244 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nbf6\" (UniqueName: \"kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6\") pod \"ca2306ed-dcff-4143-a452-9a209d0a46a1\" (UID: \"ca2306ed-dcff-4143-a452-9a209d0a46a1\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321271 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config\") pod \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.321327 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmtc9\" (UniqueName: \"kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9\") pod \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\" (UID: \"938d0b4f-21d2-4972-8436-eb1fbd6db5bc\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.322908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" (UID: "e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.349598 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w" (OuterVolumeSpecName: "kube-api-access-pmt8w") pod "e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" (UID: "e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46"). InnerVolumeSpecName "kube-api-access-pmt8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.349746 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5bf7788f9-vw5rh"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.360428 4954 scope.go:117] "RemoveContainer" containerID="2c4a1099f869e71a363c1447c04199831e773a0bfb2ea26125fb2dd4207e631d" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.360618 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9" (OuterVolumeSpecName: "kube-api-access-fmtc9") pod "938d0b4f-21d2-4972-8436-eb1fbd6db5bc" (UID: "938d0b4f-21d2-4972-8436-eb1fbd6db5bc"). InnerVolumeSpecName "kube-api-access-fmtc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.380818 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx" (OuterVolumeSpecName: "kube-api-access-975mx") pod "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" (UID: "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1"). InnerVolumeSpecName "kube-api-access-975mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.387759 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" (UID: "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.390162 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6" (OuterVolumeSpecName: "kube-api-access-2nbf6") pod "ca2306ed-dcff-4143-a452-9a209d0a46a1" (UID: "ca2306ed-dcff-4143-a452-9a209d0a46a1"). InnerVolumeSpecName "kube-api-access-2nbf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.397575 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data" (OuterVolumeSpecName: "config-data") pod "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" (UID: "7924fc4f-0ab9-4805-8d77-3a1fe2953fe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.411348 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.411544 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data" (OuterVolumeSpecName: "config-data") pod "ca2306ed-dcff-4143-a452-9a209d0a46a1" (UID: "ca2306ed-dcff-4143-a452-9a209d0a46a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.421703 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423579 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nbf6\" (UniqueName: \"kubernetes.io/projected/ca2306ed-dcff-4143-a452-9a209d0a46a1-kube-api-access-2nbf6\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423609 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmtc9\" (UniqueName: \"kubernetes.io/projected/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-api-access-fmtc9\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423620 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmt8w\" (UniqueName: \"kubernetes.io/projected/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-kube-api-access-pmt8w\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423629 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423639 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423647 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-975mx\" (UniqueName: \"kubernetes.io/projected/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-kube-api-access-975mx\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423655 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.423663 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.449880 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.457315 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.459390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "938d0b4f-21d2-4972-8436-eb1fbd6db5bc" (UID: "938d0b4f-21d2-4972-8436-eb1fbd6db5bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.474350 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.484332 4954 scope.go:117] "RemoveContainer" containerID="eace79b9d5ea075491fb63d59c67b1bf00e2fa80e5cc5eed0c509ad5570509ae" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.484669 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.490400 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "938d0b4f-21d2-4972-8436-eb1fbd6db5bc" (UID: "938d0b4f-21d2-4972-8436-eb1fbd6db5bc"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.494682 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca2306ed-dcff-4143-a452-9a209d0a46a1" (UID: "ca2306ed-dcff-4143-a452-9a209d0a46a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.510441 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.512227 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.518999 4954 scope.go:117] "RemoveContainer" containerID="68a8e45e6d42014e5a385ced480ef41ef9ac3c7dbf71d17042ad3054f509fc4c" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.525873 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.525920 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2306ed-dcff-4143-a452-9a209d0a46a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.525934 4954 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.528963 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-759b9cfd76-jp2pl"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.550329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "938d0b4f-21d2-4972-8436-eb1fbd6db5bc" (UID: "938d0b4f-21d2-4972-8436-eb1fbd6db5bc"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.552094 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.580403 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55bf8ff4-7brlv"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.584895 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="galera" containerID="cri-o://a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" gracePeriod=30 Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.594253 4954 scope.go:117] "RemoveContainer" containerID="d14ca933ddb6a12f4399c39d7a1279cb6dd7f6ad978226ac72fa861be2573071" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.602497 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.610637 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.616381 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.622555 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.626574 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v275\" (UniqueName: \"kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275\") pod \"097befa0-58fe-4616-bed7-ada4f7d81ce3\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.626626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts\") pod \"097befa0-58fe-4616-bed7-ada4f7d81ce3\" (UID: \"097befa0-58fe-4616-bed7-ada4f7d81ce3\") " Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.627194 4954 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/938d0b4f-21d2-4972-8436-eb1fbd6db5bc-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.627269 4954 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.627332 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data podName:31452db7-e2c4-4e61-8f8c-7017476f0bc0 nodeName:}" failed. No retries permitted until 2025-12-06 07:22:58.627309438 +0000 UTC m=+1553.440668827 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0") : configmap "rabbitmq-cell1-config-data" not found Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.628451 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "097befa0-58fe-4616-bed7-ada4f7d81ce3" (UID: "097befa0-58fe-4616-bed7-ada4f7d81ce3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.630979 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.636478 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275" (OuterVolumeSpecName: "kube-api-access-6v275") pod "097befa0-58fe-4616-bed7-ada4f7d81ce3" (UID: "097befa0-58fe-4616-bed7-ada4f7d81ce3"). InnerVolumeSpecName "kube-api-access-6v275". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.638292 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-74985f58f-cpdl4"] Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.728942 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v275\" (UniqueName: \"kubernetes.io/projected/097befa0-58fe-4616-bed7-ada4f7d81ce3-kube-api-access-6v275\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.728985 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097befa0-58fe-4616-bed7-ada4f7d81ce3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.746777 4954 scope.go:117] "RemoveContainer" containerID="ac777ba7c81ee56eb4fb435d7da2f478c6a3c4f40024a4daae7126627007674a" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.931399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:50 crc kubenswrapper[4954]: I1206 07:22:50.931464 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.931695 4954 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.931743 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. 
No retries permitted until 2025-12-06 07:22:52.931726546 +0000 UTC m=+1547.745085935 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : configmap "openstack-scripts" not found Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.937913 4954 projected.go:194] Error preparing data for projected volume kube-api-access-44jth for pod openstack/keystoneb67a-account-delete-lc9v4: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:50 crc kubenswrapper[4954]: E1206 07:22:50.938002 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:52.937975974 +0000 UTC m=+1547.751335363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-44jth" (UniqueName: "kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.007109 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca2306ed-dcff-4143-a452-9a209d0a46a1","Type":"ContainerDied","Data":"737c35ee429a8ed15bc551f188d6fb72b21e4170f86011d5bdfa2b550cdd9f47"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.007248 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.010996 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.012549 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.019406 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.021248 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerID="d27d5d9f536cc22c14abfb14b6f38b3f0d61e96cf9b23ee50d20f02267f9999f" exitCode=0 Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.021341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a82ba0f-bb07-4959-bfdd-8a420c617835","Type":"ContainerDied","Data":"d27d5d9f536cc22c14abfb14b6f38b3f0d61e96cf9b23ee50d20f02267f9999f"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.021376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4a82ba0f-bb07-4959-bfdd-8a420c617835","Type":"ContainerDied","Data":"66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.021393 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66af8e751472e272dd2fdfbb1944c78878f95915960d3b1e4c03c64210fcc871" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.025264 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.025328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanced905-account-delete-6dwqh" event={"ID":"e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46","Type":"ContainerDied","Data":"6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.025364 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glanced905-account-delete-6dwqh" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.025362 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e147bbf252b2623e7f1a275240b8caecc7cf44f164c70ee51f71775207cfd6a" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.027277 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron251d-account-delete-rhsq5" event={"ID":"097befa0-58fe-4616-bed7-ada4f7d81ce3","Type":"ContainerDied","Data":"05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.027298 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05cb69201f2ea66c003ce693f242485952866971608a671033ff80da6bfe931e" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.027332 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron251d-account-delete-rhsq5" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.029898 4954 generic.go:334] "Generic (PLEG): container finished" podID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerID="146293bb351f1e02c74e1849e3be8838c1377922574653baaa3053a66ff3aad5" exitCode=0 Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.029953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerDied","Data":"146293bb351f1e02c74e1849e3be8838c1377922574653baaa3053a66ff3aad5"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.047581 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.048704 4954 generic.go:334] "Generic (PLEG): container finished" podID="578bec25-a54c-4f52-95f2-19f20f833437" containerID="cf94bc0a088f861f78bc849ba8a3d2602f00e20f5f2438d40190effa61b16e55" exitCode=0 Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.048760 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerDied","Data":"cf94bc0a088f861f78bc849ba8a3d2602f00e20f5f2438d40190effa61b16e55"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.059556 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican664f-account-delete-l67mj" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.060072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican664f-account-delete-l67mj" event={"ID":"693e40da-d019-421e-83a8-6dc351580607","Type":"ContainerDied","Data":"5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.060115 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eafbc9aea9f01ff5550ea2a44b404b3a0fe542bc5f12c899789b18cbdb2b4ae" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.062833 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.064735 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement88ab-account-delete-597b8" event={"ID":"418e92aa-4713-4a55-b4d5-650587fcb6ca","Type":"ContainerDied","Data":"3825826067af0a6a8a03f9c385ee0605f6e5979cb5203df2425b0450a4e29f2a"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.064781 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3825826067af0a6a8a03f9c385ee0605f6e5979cb5203df2425b0450a4e29f2a" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.064856 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement88ab-account-delete-597b8" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.072045 4954 scope.go:117] "RemoveContainer" containerID="b55c4714520851c4ac11e12722e037b67a35d180952a2ddf1903aaa5172b8c72" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.075131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7924fc4f-0ab9-4805-8d77-3a1fe2953fe1","Type":"ContainerDied","Data":"00ec9c4811b34c7923cdf604064125a472b2f12d6fe03e96a6f5155d5d491496"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.075226 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.096299 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0f048-account-delete-6rspl" event={"ID":"81a84a0f-fa51-461c-a281-6b832ad39aa7","Type":"ContainerDied","Data":"0f8bdd98d0bbb0caf44bd6a3cfe524e18e9944d3d400afe8c088e77c6f8877d8"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.096346 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8bdd98d0bbb0caf44bd6a3cfe524e18e9944d3d400afe8c088e77c6f8877d8" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.096404 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0f048-account-delete-6rspl" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.103226 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9e6e-account-delete-2shf2" event={"ID":"c417ca1e-22df-4163-96f9-349df3d624e8","Type":"ContainerDied","Data":"767b3518e5655e87d2bf0eeb5f97fee613b6f2a08de9644c18bdbeb3ba89fb96"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.103313 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767b3518e5655e87d2bf0eeb5f97fee613b6f2a08de9644c18bdbeb3ba89fb96" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.103419 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9e6e-account-delete-2shf2" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.113834 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdf0e-account-delete-fsz84" event={"ID":"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d","Type":"ContainerDied","Data":"8d8c0607cbb962e9a1ad4064bc681757b41669b90c01f54112e79fee12ebf663"} Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.113891 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d8c0607cbb962e9a1ad4064bc681757b41669b90c01f54112e79fee12ebf663" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.113897 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.113969 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderdf0e-account-delete-fsz84" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.114232 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.132543 4954 scope.go:117] "RemoveContainer" containerID="cb83e86e00e38d45967cf91f65a342bea85bf6723ea7014271922e9077ba5d0d" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133118 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs\") pod \"4a82ba0f-bb07-4959-bfdd-8a420c617835\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133184 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts\") pod \"418e92aa-4713-4a55-b4d5-650587fcb6ca\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle\") pod \"4a82ba0f-bb07-4959-bfdd-8a420c617835\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133234 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts\") pod \"c417ca1e-22df-4163-96f9-349df3d624e8\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133259 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts\") pod \"693e40da-d019-421e-83a8-6dc351580607\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133286 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbc8h\" (UniqueName: \"kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h\") pod \"418e92aa-4713-4a55-b4d5-650587fcb6ca\" (UID: \"418e92aa-4713-4a55-b4d5-650587fcb6ca\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133309 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts\") pod \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133354 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9rr\" (UniqueName: \"kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr\") pod \"81a84a0f-fa51-461c-a281-6b832ad39aa7\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133407 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts\") pod \"81a84a0f-fa51-461c-a281-6b832ad39aa7\" (UID: \"81a84a0f-fa51-461c-a281-6b832ad39aa7\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 
07:22:51.133444 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mclv\" (UniqueName: \"kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv\") pod \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\" (UID: \"d2ccdb46-97b8-40d8-aebb-5cf28cb6854d\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data\") pod \"4a82ba0f-bb07-4959-bfdd-8a420c617835\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.133493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config\") pod \"4a82ba0f-bb07-4959-bfdd-8a420c617835\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.134674 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81a84a0f-fa51-461c-a281-6b832ad39aa7" (UID: "81a84a0f-fa51-461c-a281-6b832ad39aa7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.134782 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" (UID: "d2ccdb46-97b8-40d8-aebb-5cf28cb6854d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.135241 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "693e40da-d019-421e-83a8-6dc351580607" (UID: "693e40da-d019-421e-83a8-6dc351580607"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.135820 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c417ca1e-22df-4163-96f9-349df3d624e8" (UID: "c417ca1e-22df-4163-96f9-349df3d624e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.135836 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "418e92aa-4713-4a55-b4d5-650587fcb6ca" (UID: "418e92aa-4713-4a55-b4d5-650587fcb6ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136042 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136231 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4gg\" (UniqueName: \"kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg\") pod \"693e40da-d019-421e-83a8-6dc351580607\" (UID: \"693e40da-d019-421e-83a8-6dc351580607\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136283 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrq9\" (UniqueName: \"kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9\") pod \"c417ca1e-22df-4163-96f9-349df3d624e8\" (UID: \"c417ca1e-22df-4163-96f9-349df3d624e8\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136333 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gct2\" (UniqueName: \"kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2\") pod \"4a82ba0f-bb07-4959-bfdd-8a420c617835\" (UID: \"4a82ba0f-bb07-4959-bfdd-8a420c617835\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data" (OuterVolumeSpecName: "config-data") pod "4a82ba0f-bb07-4959-bfdd-8a420c617835" (UID: "4a82ba0f-bb07-4959-bfdd-8a420c617835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.136859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4a82ba0f-bb07-4959-bfdd-8a420c617835" (UID: "4a82ba0f-bb07-4959-bfdd-8a420c617835"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137225 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137254 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a84a0f-fa51-461c-a281-6b832ad39aa7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137266 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137277 4954 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a82ba0f-bb07-4959-bfdd-8a420c617835-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137288 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/418e92aa-4713-4a55-b4d5-650587fcb6ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137298 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c417ca1e-22df-4163-96f9-349df3d624e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.137309 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693e40da-d019-421e-83a8-6dc351580607-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.141218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h" (OuterVolumeSpecName: "kube-api-access-qbc8h") pod "418e92aa-4713-4a55-b4d5-650587fcb6ca" (UID: "418e92aa-4713-4a55-b4d5-650587fcb6ca"). InnerVolumeSpecName "kube-api-access-qbc8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.143119 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr" (OuterVolumeSpecName: "kube-api-access-hc9rr") pod "81a84a0f-fa51-461c-a281-6b832ad39aa7" (UID: "81a84a0f-fa51-461c-a281-6b832ad39aa7"). InnerVolumeSpecName "kube-api-access-hc9rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.150136 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg" (OuterVolumeSpecName: "kube-api-access-nl4gg") pod "693e40da-d019-421e-83a8-6dc351580607" (UID: "693e40da-d019-421e-83a8-6dc351580607"). InnerVolumeSpecName "kube-api-access-nl4gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.150242 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2" (OuterVolumeSpecName: "kube-api-access-4gct2") pod "4a82ba0f-bb07-4959-bfdd-8a420c617835" (UID: "4a82ba0f-bb07-4959-bfdd-8a420c617835"). InnerVolumeSpecName "kube-api-access-4gct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.150049 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9" (OuterVolumeSpecName: "kube-api-access-2mrq9") pod "c417ca1e-22df-4163-96f9-349df3d624e8" (UID: "c417ca1e-22df-4163-96f9-349df3d624e8"). InnerVolumeSpecName "kube-api-access-2mrq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.163958 4954 scope.go:117] "RemoveContainer" containerID="01b096b10176462db3110ae80702f8ad1ade5c76a6f11bd639c695de9a47e99c" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.189022 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv" (OuterVolumeSpecName: "kube-api-access-9mclv") pod "d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" (UID: "d2ccdb46-97b8-40d8-aebb-5cf28cb6854d"). InnerVolumeSpecName "kube-api-access-9mclv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.194662 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.198256 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a82ba0f-bb07-4959-bfdd-8a420c617835" (UID: "4a82ba0f-bb07-4959-bfdd-8a420c617835"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.200254 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.204079 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "4a82ba0f-bb07-4959-bfdd-8a420c617835" (UID: "4a82ba0f-bb07-4959-bfdd-8a420c617835"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.204853 4954 scope.go:117] "RemoveContainer" containerID="d10a4557556efb566e2c349cfa75bf39a1a2ea52776d0c439f69d9cdbd06b4fa" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.232433 4954 scope.go:117] "RemoveContainer" containerID="dff634325c63563a5768825e16323ab3d990f36bc7c9fcb6773bf50f64fbe138" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.235281 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239392 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mclv\" (UniqueName: \"kubernetes.io/projected/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d-kube-api-access-9mclv\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239419 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4gg\" (UniqueName: \"kubernetes.io/projected/693e40da-d019-421e-83a8-6dc351580607-kube-api-access-nl4gg\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239430 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrq9\" (UniqueName: \"kubernetes.io/projected/c417ca1e-22df-4163-96f9-349df3d624e8-kube-api-access-2mrq9\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239439 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gct2\" (UniqueName: \"kubernetes.io/projected/4a82ba0f-bb07-4959-bfdd-8a420c617835-kube-api-access-4gct2\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239448 4954 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239469 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a82ba0f-bb07-4959-bfdd-8a420c617835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239478 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbc8h\" (UniqueName: \"kubernetes.io/projected/418e92aa-4713-4a55-b4d5-650587fcb6ca-kube-api-access-qbc8h\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.239488 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9rr\" (UniqueName: \"kubernetes.io/projected/81a84a0f-fa51-461c-a281-6b832ad39aa7-kube-api-access-hc9rr\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.257533 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.259785 4954 scope.go:117] "RemoveContainer" containerID="0bb0b24e4d3085b50b85c5df8d24f8ed4c7c0c28e8532d6d7a2d8957c39ce556" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.267121 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.277803 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.282214 4954 scope.go:117] "RemoveContainer" 
containerID="e09a4c785be2122538bcc0a01f7e6782d054fe47c518d01aab3ac5049b21cf94" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.312637 4954 scope.go:117] "RemoveContainer" containerID="c848e10be510af8ec76555e3960b2f1fc5ccf1b3023d60abc7f4ffb8dd93dba2" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.337281 4954 scope.go:117] "RemoveContainer" containerID="6ab0f45e984c650aee9fc279116d7bc9e68e5fe5d6b74cb802fbb810f6d5de26" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.394348 4954 scope.go:117] "RemoveContainer" containerID="27d6334f40e13cafc61de087025fab078b3fe41774336c33565c8ae4035e063f" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.457664 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054f854f-83bd-4174-bf86-da2b1a6cfedb" path="/var/lib/kubelet/pods/054f854f-83bd-4174-bf86-da2b1a6cfedb/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.458923 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069bb9c2-be47-445b-ba0e-f32a7db0b96e" path="/var/lib/kubelet/pods/069bb9c2-be47-445b-ba0e-f32a7db0b96e/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.460038 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfba967-3154-41c7-b541-45d9a8a2a122" path="/var/lib/kubelet/pods/0cfba967-3154-41c7-b541-45d9a8a2a122/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.461845 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" path="/var/lib/kubelet/pods/25414b25-a6cc-41e6-8360-3e85f54321d5/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.462866 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a14211-fd70-4578-83c1-d674b2cf6172" path="/var/lib/kubelet/pods/37a14211-fd70-4578-83c1-d674b2cf6172/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.463706 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" path="/var/lib/kubelet/pods/432b9d93-b045-4e25-b58b-b3a6fd8512c4/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.464804 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" path="/var/lib/kubelet/pods/7924fc4f-0ab9-4805-8d77-3a1fe2953fe1/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.465365 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79dc6de3-cf27-4c1d-91c5-f922acd48400" path="/var/lib/kubelet/pods/79dc6de3-cf27-4c1d-91c5-f922acd48400/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.466014 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" path="/var/lib/kubelet/pods/938d0b4f-21d2-4972-8436-eb1fbd6db5bc/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.467297 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b907c888-706a-4183-b581-ff7b4742fc74" path="/var/lib/kubelet/pods/b907c888-706a-4183-b581-ff7b4742fc74/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.468130 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" path="/var/lib/kubelet/pods/c75e5da0-fd73-485e-b53a-b5e96965bb99/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.469629 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe" 
path="/var/lib/kubelet/pods/c76a6f0b-0bc0-4f2e-8e03-f7871b960ebe/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.470327 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" path="/var/lib/kubelet/pods/ca2306ed-dcff-4143-a452-9a209d0a46a1/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.471018 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" path="/var/lib/kubelet/pods/f84d04d2-6282-4a9c-89a8-3aa64ef22c74/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.472713 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eaecc6-8e85-432e-a906-3fcecee9fc1d" path="/var/lib/kubelet/pods/f9eaecc6-8e85-432e-a906-3fcecee9fc1d/volumes" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.516828 4954 scope.go:117] "RemoveContainer" containerID="ce32b7671edf55a676312d4bc38e6dda973ecc26c56be59a3598a590d1eebc14" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.522861 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="5e602bfa-4a57-4fe6-adca-90acb13b7458" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.196:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.555917 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.628716 4954 scope.go:117] "RemoveContainer" containerID="632cafbc6ab8b4bca12eb7135c2589ae827ea7666ad9a40b63f7adf3a97af9c7" Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.637186 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.637446 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.637728 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.638210 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.638238 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.642044 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.653814 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.654821 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjvx\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.654949 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.654976 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.655025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.655074 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: 
\"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.655138 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.662952 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.665329 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:51 crc kubenswrapper[4954]: E1206 07:22:51.665396 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.667137 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx" (OuterVolumeSpecName: "kube-api-access-qhjvx") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "kube-api-access-qhjvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.667166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.671706 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.671835 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.671863 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.671894 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.671920 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"578bec25-a54c-4f52-95f2-19f20f833437\" (UID: \"578bec25-a54c-4f52-95f2-19f20f833437\") " Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.672505 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhjvx\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-kube-api-access-qhjvx\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.672522 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.672535 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.675774 4954 scope.go:117] "RemoveContainer" containerID="358a83794fb9c4d6619d2cfaee88e7ec329afae2443bcf0de2adee871931aa8f" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.680416 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.681512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.684096 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info" (OuterVolumeSpecName: "pod-info") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.684130 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.712898 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:22:51 crc kubenswrapper[4954]: I1206 07:22:51.743547 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf" (OuterVolumeSpecName: "server-conf") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.778799 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782021 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782172 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782507 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782539 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782847 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bq5n\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.782985 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783004 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783167 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf\") pod \"31452db7-e2c4-4e61-8f8c-7017476f0bc0\" (UID: 
\"31452db7-e2c4-4e61-8f8c-7017476f0bc0\") " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783486 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783500 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783509 4954 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/578bec25-a54c-4f52-95f2-19f20f833437-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783519 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783527 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783535 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/578bec25-a54c-4f52-95f2-19f20f833437-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.783554 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.791067 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.791458 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:51.821663 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.429500 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="c75e5da0-fd73-485e-b53a-b5e96965bb99" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.436309 4954 scope.go:117] "RemoveContainer" containerID="3441df07b213b831010c3f148e9250178d2250f974a1c6530c3c0990a93f5d6f" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.439175 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.441327 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n" (OuterVolumeSpecName: "kube-api-access-9bq5n") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "kube-api-access-9bq5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.441398 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.466280 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data" (OuterVolumeSpecName: "config-data") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.466853 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info" (OuterVolumeSpecName: "pod-info") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.475999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") pod \"keystoneb67a-account-delete-lc9v4\" (UID: \"2a9a9a20-3eb3-4fbe-9bab-18f69f01399f\") " pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476740 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/578bec25-a54c-4f52-95f2-19f20f833437-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476756 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476766 4954 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31452db7-e2c4-4e61-8f8c-7017476f0bc0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476779 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31452db7-e2c4-4e61-8f8c-7017476f0bc0-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476789 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bq5n\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-kube-api-access-9bq5n\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476799 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476807 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.476827 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.482423 4954 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.482930 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:57.482905506 +0000 UTC m=+1552.296264945 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : configmap "openstack-scripts" not found Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.484976 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data" (OuterVolumeSpecName: "config-data") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.491539 4954 projected.go:194] Error preparing data for projected volume kube-api-access-44jth for pod openstack/keystoneb67a-account-delete-lc9v4: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.491626 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth podName:2a9a9a20-3eb3-4fbe-9bab-18f69f01399f nodeName:}" failed. No retries permitted until 2025-12-06 07:22:57.491604661 +0000 UTC m=+1552.304964050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-44jth" (UniqueName: "kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth") pod "keystoneb67a-account-delete-lc9v4" (UID: "2a9a9a20-3eb3-4fbe-9bab-18f69f01399f") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.492328 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.517404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "578bec25-a54c-4f52-95f2-19f20f833437" (UID: "578bec25-a54c-4f52-95f2-19f20f833437"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.524352 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.537275 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf" (OuterVolumeSpecName: "server-conf") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.556522 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.573656 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6de2aad5-fb15-489c-b0fc-200e18ad3baa/ovn-northd/0.log" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.573756 4954 generic.go:334] "Generic (PLEG): container finished" podID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" exitCode=139 Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.578893 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.578930 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/578bec25-a54c-4f52-95f2-19f20f833437-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.578941 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.578950 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31452db7-e2c4-4e61-8f8c-7017476f0bc0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.578961 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.633830 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "31452db7-e2c4-4e61-8f8c-7017476f0bc0" (UID: "31452db7-e2c4-4e61-8f8c-7017476f0bc0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.681491 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31452db7-e2c4-4e61-8f8c-7017476f0bc0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.702103 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.703674 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.703739 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystoneb67a-account-delete-lc9v4" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.823124 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": context deadline exceeded" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.823116 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.863916 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"578bec25-a54c-4f52-95f2-19f20f833437","Type":"ContainerDied","Data":"143d686cdba200b11f7ddb4a4fa6fe1c57963da39e319be167ff52a4eb9b563b"} Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.863994 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerDied","Data":"9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6"} Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864016 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-829jd"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864044 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-829jd"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864066 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glanced905-account-delete-6dwqh"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864085 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d905-account-create-update-tljn7"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864369 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glanced905-account-delete-6dwqh"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864390 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d905-account-create-update-tljn7"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864416 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xjfwm"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864441 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31452db7-e2c4-4e61-8f8c-7017476f0bc0","Type":"ContainerDied","Data":"0df6ae2d74b1f5fbb0262097ec95719175941863764004072af6dd9d2a84a4e7"} Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.864767 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xjfwm"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865021 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-df0e-account-create-update-nllgl"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865040 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderdf0e-account-delete-fsz84"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865070 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-df0e-account-create-update-nllgl"] Dec 06 
07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865087 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderdf0e-account-delete-fsz84"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865102 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m72rg"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865116 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m72rg"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865137 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement88ab-account-delete-597b8"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865149 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-88ab-account-create-update-8grdl"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865168 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-88ab-account-create-update-8grdl"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865185 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement88ab-account-delete-597b8"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865210 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-s7jcc"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865225 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-s7jcc"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865240 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican664f-account-delete-l67mj"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865255 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-664f-account-create-update-74gq9"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865282 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican664f-account-delete-l67mj"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865299 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-664f-account-create-update-74gq9"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865317 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-498rf"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.865330 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-498rf"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.875338 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron251d-account-delete-rhsq5"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.906035 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-251d-account-create-update-d7mbv"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.913120 4954 scope.go:117] "RemoveContainer" containerID="cf94bc0a088f861f78bc849ba8a3d2602f00e20f5f2438d40190effa61b16e55" Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.932085 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-251d-account-create-update-d7mbv"] Dec 06 07:22:53 crc kubenswrapper[4954]: I1206 07:22:53.946191 4954 scope.go:117] "RemoveContainer" containerID="0a70376c5fc5526ced20f1109a9cab2d6bb28881276a6118e1121122aa40551c" Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.966286 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6 is running failed: container process not found" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.967775 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6 is running failed: container process not found" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.968378 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6 is running failed: container process not found" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 06 07:22:53 crc kubenswrapper[4954]: E1206 07:22:53.968413 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.009908 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron251d-account-delete-rhsq5"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.010015 4954 scope.go:117] "RemoveContainer" containerID="146293bb351f1e02c74e1849e3be8838c1377922574653baaa3053a66ff3aad5" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.014351 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6de2aad5-fb15-489c-b0fc-200e18ad3baa/ovn-northd/0.log" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.014493 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.027280 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5fm8l"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.042414 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5fm8l"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.049644 4954 scope.go:117] "RemoveContainer" containerID="b6518d5655feada488f616ac02858e05fce9b02722643230cbaae6694b695cac" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.053528 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi9e6e-account-delete-2shf2"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.062188 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9e6e-account-create-update-fp6hk"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.071781 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi9e6e-account-delete-2shf2"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.083284 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9e6e-account-create-update-fp6hk"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.091234 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gqjsq"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093538 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093616 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093691 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093729 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093762 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdlk\" (UniqueName: \"kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.093800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc 
kubenswrapper[4954]: I1206 07:22:54.093823 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs\") pod \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\" (UID: \"6de2aad5-fb15-489c-b0fc-200e18ad3baa\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.100012 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config" (OuterVolumeSpecName: "config") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.100866 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts" (OuterVolumeSpecName: "scripts") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.101211 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.101265 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gqjsq"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.110168 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0f048-account-delete-6rspl"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.114064 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f048-account-create-update-88gtz"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.115748 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk" (OuterVolumeSpecName: "kube-api-access-gfdlk") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "kube-api-access-gfdlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.124614 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f048-account-create-update-88gtz"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.129656 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0f048-account-delete-6rspl"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.136980 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.143652 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.151696 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.153249 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.157892 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.160841 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.164193 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.169812 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.182455 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystoneb67a-account-delete-lc9v4"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.188876 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystoneb67a-account-delete-lc9v4"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.190706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.191060 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "6de2aad5-fb15-489c-b0fc-200e18ad3baa" (UID: "6de2aad5-fb15-489c-b0fc-200e18ad3baa"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195237 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195417 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195457 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195475 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.195535 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6tsh\" (UniqueName: \"kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196112 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196391 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data\") pod \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\" (UID: \"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196898 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196912 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6de2aad5-fb15-489c-b0fc-200e18ad3baa-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196922 4954 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196933 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196943 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdlk\" (UniqueName: \"kubernetes.io/projected/6de2aad5-fb15-489c-b0fc-200e18ad3baa-kube-api-access-gfdlk\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196952 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.196961 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6de2aad5-fb15-489c-b0fc-200e18ad3baa-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.201069 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.201094 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts" (OuterVolumeSpecName: "scripts") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.211436 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.211617 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh" (OuterVolumeSpecName: "kube-api-access-s6tsh") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "kube-api-access-s6tsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.232388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data" (OuterVolumeSpecName: "config-data") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.235681 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.254753 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.256285 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.292335 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" (UID: "fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299124 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299155 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gp7f\" (UniqueName: \"kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299181 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299233 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299302 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299329 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299367 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data\") pod \"6f6b0598-f49f-4300-a2e3-edb512001517\" (UID: \"6f6b0598-f49f-4300-a2e3-edb512001517\") " Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299690 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299711 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299723 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299717 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299732 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299793 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6tsh\" (UniqueName: \"kubernetes.io/projected/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-kube-api-access-s6tsh\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299812 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299828 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299841 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jth\" (UniqueName: \"kubernetes.io/projected/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f-kube-api-access-44jth\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299853 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.299864 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.300816 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.303681 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts" (OuterVolumeSpecName: "scripts") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.308435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f" (OuterVolumeSpecName: "kube-api-access-4gp7f") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "kube-api-access-4gp7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.337070 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.338783 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.378619 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.397539 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data" (OuterVolumeSpecName: "config-data") pod "6f6b0598-f49f-4300-a2e3-edb512001517" (UID: "6f6b0598-f49f-4300-a2e3-edb512001517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401780 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401815 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401825 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401837 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401846 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401857 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6b0598-f49f-4300-a2e3-edb512001517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401866 4954 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4gp7f\" (UniqueName: \"kubernetes.io/projected/6f6b0598-f49f-4300-a2e3-edb512001517-kube-api-access-4gp7f\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.401874 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6b0598-f49f-4300-a2e3-edb512001517-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:54 crc kubenswrapper[4954]: E1206 07:22:54.492763 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:54 crc kubenswrapper[4954]: E1206 07:22:54.496100 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:54 crc kubenswrapper[4954]: E1206 07:22:54.497760 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 06 07:22:54 crc kubenswrapper[4954]: E1206 07:22:54.497810 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="galera" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.715216 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6de2aad5-fb15-489c-b0fc-200e18ad3baa/ovn-northd/0.log" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.715337 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6de2aad5-fb15-489c-b0fc-200e18ad3baa","Type":"ContainerDied","Data":"5f9b60d52550887376cc37531f59cf1c2cf9cd5805e21382219d8a746606cefb"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.715388 4954 scope.go:117] "RemoveContainer" containerID="b185262581277ddddfd5f329caf8d803c0479c8dea4732929b0bc2936e8a4cb6" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.715401 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.726785 4954 generic.go:334] "Generic (PLEG): container finished" podID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" containerID="cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7" exitCode=0 Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.726863 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55864b7b7d-dsx9g" event={"ID":"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a","Type":"ContainerDied","Data":"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.726911 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55864b7b7d-dsx9g" event={"ID":"fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a","Type":"ContainerDied","Data":"bc6e5869146f90030326cf9214bc35bbc536cef99cafd3be22659e03b413e05f"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.726999 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55864b7b7d-dsx9g" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.732261 4954 generic.go:334] "Generic (PLEG): container finished" podID="841d2cc4-0265-4e02-af59-f0f322208f02" containerID="a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" exitCode=0 Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.732359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerDied","Data":"a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.743743 4954 generic.go:334] "Generic (PLEG): container finished" podID="6f6b0598-f49f-4300-a2e3-edb512001517" containerID="87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812" exitCode=0 Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.743790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerDied","Data":"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.743815 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.743824 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6b0598-f49f-4300-a2e3-edb512001517","Type":"ContainerDied","Data":"6e7d8d3bc70058aa8b89c559d1a61bc72fc98f0687b379dea6d56813fad2887a"} Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.781270 4954 scope.go:117] "RemoveContainer" containerID="9874b322d9222424c03e259859ca289d93dd364308738ff78e86095855f8b6e6" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.781633 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.794298 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.804964 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.811995 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.817433 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.824703 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-55864b7b7d-dsx9g"] Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.860316 4954 scope.go:117] "RemoveContainer" containerID="cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.897954 4954 scope.go:117] "RemoveContainer" containerID="cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7" Dec 06 07:22:54 crc kubenswrapper[4954]: E1206 07:22:54.899813 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7\": container with ID starting with cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7 not found: ID does not exist" containerID="cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.899860 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7"} err="failed to get container status \"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7\": rpc error: code = NotFound desc = could not find container \"cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7\": container with ID starting with cbea21543fc70a07f7bd97a251a95abf2e5f0008dd200e897c3854aec6d26be7 not found: ID does not exist" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.899888 4954 scope.go:117] "RemoveContainer" containerID="872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.931197 4954 scope.go:117] "RemoveContainer" containerID="ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5" Dec 06 07:22:54 crc kubenswrapper[4954]: I1206 07:22:54.951423 4954 scope.go:117] "RemoveContainer" containerID="87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.005267 4954 scope.go:117] "RemoveContainer" 
containerID="0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.025273 4954 scope.go:117] "RemoveContainer" containerID="872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271" Dec 06 07:22:55 crc kubenswrapper[4954]: E1206 07:22:55.025872 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271\": container with ID starting with 872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271 not found: ID does not exist" containerID="872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.025933 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271"} err="failed to get container status \"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271\": rpc error: code = NotFound desc = could not find container \"872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271\": container with ID starting with 872a2bac50f05d39969f244d52edcb2950bf78dcb8c8eeda153b5323ad19a271 not found: ID does not exist" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.025970 4954 scope.go:117] "RemoveContainer" containerID="ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5" Dec 06 07:22:55 crc kubenswrapper[4954]: E1206 07:22:55.026632 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5\": container with ID starting with ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5 not found: ID does not exist" containerID="ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.026675 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5"} err="failed to get container status \"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5\": rpc error: code = NotFound desc = could not find container \"ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5\": container with ID starting with ec3ed71b2491ef41795c579a498091f87719a945dc7c7ab6d0235f2e1a17dee5 not found: ID does not exist" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.026710 4954 scope.go:117] "RemoveContainer" containerID="87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812" Dec 06 07:22:55 crc kubenswrapper[4954]: E1206 07:22:55.027057 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812\": container with ID starting with 87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812 not found: ID does not exist" containerID="87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.027105 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812"} err="failed to get container status \"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812\": rpc error: code = 
NotFound desc = could not find container \"87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812\": container with ID starting with 87f6b7dbad9cba2999b009687b2b8792fd5e80223d28635e9b8a19849f7be812 not found: ID does not exist" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.027139 4954 scope.go:117] "RemoveContainer" containerID="0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2" Dec 06 07:22:55 crc kubenswrapper[4954]: E1206 07:22:55.027444 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2\": container with ID starting with 0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2 not found: ID does not exist" containerID="0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.027470 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2"} err="failed to get container status \"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2\": rpc error: code = NotFound desc = could not find container \"0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2\": container with ID starting with 0231a48c4c2f1af85763f5b6c38a7336d6485526192523aeaa1890f0cad332a2 not found: ID does not exist" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.176698 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.213490 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.213629 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4v4p\" (UniqueName: \"kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.213699 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.213725 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.214346 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.214661 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.214803 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.214909 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.214937 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.215246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.215815 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.215912 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle\") pod \"841d2cc4-0265-4e02-af59-f0f322208f02\" (UID: \"841d2cc4-0265-4e02-af59-f0f322208f02\") " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.216582 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.216600 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.216608 4954 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/841d2cc4-0265-4e02-af59-f0f322208f02-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.216616 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/841d2cc4-0265-4e02-af59-f0f322208f02-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.218948 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p" (OuterVolumeSpecName: "kube-api-access-x4v4p") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "kube-api-access-x4v4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.226411 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.284058 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.307972 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "841d2cc4-0265-4e02-af59-f0f322208f02" (UID: "841d2cc4-0265-4e02-af59-f0f322208f02"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.318433 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.318650 4954 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.318913 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841d2cc4-0265-4e02-af59-f0f322208f02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.319003 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4v4p\" (UniqueName: \"kubernetes.io/projected/841d2cc4-0265-4e02-af59-f0f322208f02-kube-api-access-x4v4p\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.337334 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.421551 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.462201 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097befa0-58fe-4616-bed7-ada4f7d81ce3" path="/var/lib/kubelet/pods/097befa0-58fe-4616-bed7-ada4f7d81ce3/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.462917 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a9a9a20-3eb3-4fbe-9bab-18f69f01399f" path="/var/lib/kubelet/pods/2a9a9a20-3eb3-4fbe-9bab-18f69f01399f/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.463777 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" path="/var/lib/kubelet/pods/31452db7-e2c4-4e61-8f8c-7017476f0bc0/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.466317 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3656e91b-4a6f-4ae8-a097-0f5961df5f25" path="/var/lib/kubelet/pods/3656e91b-4a6f-4ae8-a097-0f5961df5f25/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.468548 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372dbd7a-e026-4794-9f89-071af1d6e3a8" path="/var/lib/kubelet/pods/372dbd7a-e026-4794-9f89-071af1d6e3a8/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.469269 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dfefc17-9a79-4857-b32d-d1b2f7ba15dc" path="/var/lib/kubelet/pods/3dfefc17-9a79-4857-b32d-d1b2f7ba15dc/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.469986 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ece29cd-d385-457d-853d-f37b09007d05" path="/var/lib/kubelet/pods/3ece29cd-d385-457d-853d-f37b09007d05/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.471204 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418e92aa-4713-4a55-b4d5-650587fcb6ca" 
path="/var/lib/kubelet/pods/418e92aa-4713-4a55-b4d5-650587fcb6ca/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.471808 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45760345-6d26-4a9b-931e-1c4d5cc3ca3e" path="/var/lib/kubelet/pods/45760345-6d26-4a9b-931e-1c4d5cc3ca3e/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.472442 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" path="/var/lib/kubelet/pods/4a82ba0f-bb07-4959-bfdd-8a420c617835/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.475411 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578bec25-a54c-4f52-95f2-19f20f833437" path="/var/lib/kubelet/pods/578bec25-a54c-4f52-95f2-19f20f833437/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.476267 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aec320e-8fda-4ac0-aac9-85fa4a436c01" path="/var/lib/kubelet/pods/5aec320e-8fda-4ac0-aac9-85fa4a436c01/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.476902 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693e40da-d019-421e-83a8-6dc351580607" path="/var/lib/kubelet/pods/693e40da-d019-421e-83a8-6dc351580607/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.478004 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" path="/var/lib/kubelet/pods/6de2aad5-fb15-489c-b0fc-200e18ad3baa/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.479975 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" path="/var/lib/kubelet/pods/6f6b0598-f49f-4300-a2e3-edb512001517/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.481402 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a84a0f-fa51-461c-a281-6b832ad39aa7" path="/var/lib/kubelet/pods/81a84a0f-fa51-461c-a281-6b832ad39aa7/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.482005 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e2968b-ef32-49ec-81e1-e3f07c3b73b8" path="/var/lib/kubelet/pods/81e2968b-ef32-49ec-81e1-e3f07c3b73b8/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.482488 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8234c284-a6cb-4bf4-b63a-534a4ee085aa" path="/var/lib/kubelet/pods/8234c284-a6cb-4bf4-b63a-534a4ee085aa/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.483308 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b5d766-a567-4328-9942-c0dc2eede97e" path="/var/lib/kubelet/pods/94b5d766-a567-4328-9942-c0dc2eede97e/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.484399 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca78e5f-9977-49fa-b175-504c48bf1861" path="/var/lib/kubelet/pods/9ca78e5f-9977-49fa-b175-504c48bf1861/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.485247 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33fd05b-5e69-4042-a5b2-b07dd4edd63d" path="/var/lib/kubelet/pods/a33fd05b-5e69-4042-a5b2-b07dd4edd63d/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.486248 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a823a756-929e-41ba-a100-0fb69d74b7b8" 
path="/var/lib/kubelet/pods/a823a756-929e-41ba-a100-0fb69d74b7b8/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.487516 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b279d2ed-3be6-4f9e-b27d-8fac529b4d32" path="/var/lib/kubelet/pods/b279d2ed-3be6-4f9e-b27d-8fac529b4d32/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.488400 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c417ca1e-22df-4163-96f9-349df3d624e8" path="/var/lib/kubelet/pods/c417ca1e-22df-4163-96f9-349df3d624e8/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.489175 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4546452-6f33-4f15-9b6e-30aecc6e81d1" path="/var/lib/kubelet/pods/c4546452-6f33-4f15-9b6e-30aecc6e81d1/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.489998 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" path="/var/lib/kubelet/pods/d2ccdb46-97b8-40d8-aebb-5cf28cb6854d/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.491382 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" path="/var/lib/kubelet/pods/e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.492141 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" path="/var/lib/kubelet/pods/fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a/volumes" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.756637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"841d2cc4-0265-4e02-af59-f0f322208f02","Type":"ContainerDied","Data":"e5166c69c0f67101f30360e8db04da0fed96c43f3c89d13729a52e456226006f"} Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.758043 4954 scope.go:117] "RemoveContainer" containerID="a193b8a0fa409b5321f53a307fc650e4cd2f601366017277f503ce0ae781286a" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.756681 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.766361 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.106:11211: i/o timeout" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.783398 4954 scope.go:117] "RemoveContainer" containerID="776ef8a2e162102128c3ff29599625dfa34fd8480af7d7c05f9b0eae321798f5" Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.786132 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 07:22:55 crc kubenswrapper[4954]: I1206 07:22:55.792623 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.635908 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.636591 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.636610 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.636886 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.636933 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.637928 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:56 crc 
kubenswrapper[4954]: E1206 07:22:56.639037 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:22:56 crc kubenswrapper[4954]: E1206 07:22:56.639138 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:22:57 crc kubenswrapper[4954]: I1206 07:22:57.467268 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" path="/var/lib/kubelet/pods/841d2cc4-0265-4e02-af59-f0f322208f02/volumes" Dec 06 07:23:01 crc kubenswrapper[4954]: I1206 07:23:01.444471 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.445060 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.634402 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.635124 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.635322 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.635364 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" 
Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.636842 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.641158 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.642935 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:01 crc kubenswrapper[4954]: E1206 07:23:01.642982 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd"
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.634605 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.637258 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.637352 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.637815 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.637930 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server"
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.639143 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.640814 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 06 07:23:06 crc kubenswrapper[4954]: E1206 07:23:06.641113 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd"
Dec 06 07:23:07 crc kubenswrapper[4954]: I1206 07:23:07.919479 4954 generic.go:334] "Generic (PLEG): container finished" podID="096c2131-031b-4573-ade7-b1d0d34abc60" containerID="59705182107d1f5a2685c487316a4e8779bdf03f701931d0f8c6734cc83fb7c6" exitCode=0
Dec 06 07:23:07 crc kubenswrapper[4954]: I1206 07:23:07.919622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerDied","Data":"59705182107d1f5a2685c487316a4e8779bdf03f701931d0f8c6734cc83fb7c6"}
Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.144846 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.229654 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230163 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jtr\" (UniqueName: \"kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230202 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230230 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230262 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.230286 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle\") pod \"096c2131-031b-4573-ade7-b1d0d34abc60\" (UID: \"096c2131-031b-4573-ade7-b1d0d34abc60\") " Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.236612 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr" (OuterVolumeSpecName: "kube-api-access-l2jtr") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "kube-api-access-l2jtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.238002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.273541 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config" (OuterVolumeSpecName: "config") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.275077 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.277941 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.289778 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.305026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "096c2131-031b-4573-ade7-b1d0d34abc60" (UID: "096c2131-031b-4573-ade7-b1d0d34abc60"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332005 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2jtr\" (UniqueName: \"kubernetes.io/projected/096c2131-031b-4573-ade7-b1d0d34abc60-kube-api-access-l2jtr\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332035 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332048 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332060 4954 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332070 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332079 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.332091 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c2131-031b-4573-ade7-b1d0d34abc60-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.931416 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78654684fc-84hfw" event={"ID":"096c2131-031b-4573-ade7-b1d0d34abc60","Type":"ContainerDied","Data":"36be5325833648901d84529ec4c478968dce341b4f206524051d37b86a68aec2"} Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.931478 4954 scope.go:117] "RemoveContainer" containerID="feff0af557a19a07e436a4ff2db6401f9895dc0adfba494469ecdb5c414013e9" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.931478 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78654684fc-84hfw" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.964631 4954 scope.go:117] "RemoveContainer" containerID="59705182107d1f5a2685c487316a4e8779bdf03f701931d0f8c6734cc83fb7c6" Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.965594 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:23:08 crc kubenswrapper[4954]: I1206 07:23:08.971932 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78654684fc-84hfw"] Dec 06 07:23:09 crc kubenswrapper[4954]: I1206 07:23:09.454633 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" path="/var/lib/kubelet/pods/096c2131-031b-4573-ade7-b1d0d34abc60/volumes" Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.634990 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.637873 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.638125 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.638405 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.638445 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.639962 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 
07:23:11.641697 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 06 07:23:11 crc kubenswrapper[4954]: E1206 07:23:11.641738 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xskgs" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.003744 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xskgs_a85b49d0-cc8d-4dce-aade-6c63af659f42/ovs-vswitchd/0.log" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.005703 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" exitCode=137 Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.005792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerDied","Data":"f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d"} Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.023336 4954 generic.go:334] "Generic (PLEG): container finished" podID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerID="aa40c5ec6d752ec2fbe2321f2103def28820a11e0c19b5134c7eb5bd9a35558b" exitCode=137 Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.023390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"aa40c5ec6d752ec2fbe2321f2103def28820a11e0c19b5134c7eb5bd9a35558b"} Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.191852 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xskgs_a85b49d0-cc8d-4dce-aade-6c63af659f42/ovs-vswitchd/0.log" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.194088 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.310764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pxxb\" (UniqueName: \"kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.310922 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.310956 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.310974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.310998 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.311025 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib\") pod \"a85b49d0-cc8d-4dce-aade-6c63af659f42\" (UID: \"a85b49d0-cc8d-4dce-aade-6c63af659f42\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.311258 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log" (OuterVolumeSpecName: "var-log") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.311388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run" (OuterVolumeSpecName: "var-run") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.311574 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib" (OuterVolumeSpecName: "var-lib") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.311607 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.312023 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.312899 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts" (OuterVolumeSpecName: "scripts") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.321975 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb" (OuterVolumeSpecName: "kube-api-access-9pxxb") pod "a85b49d0-cc8d-4dce-aade-6c63af659f42" (UID: "a85b49d0-cc8d-4dce-aade-6c63af659f42"). InnerVolumeSpecName "kube-api-access-9pxxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.412140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock\") pod \"b304148c-ab0e-42ac-966a-024ff59a8cde\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.412240 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v69bl\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl\") pod \"b304148c-ab0e-42ac-966a-024ff59a8cde\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.412271 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") pod \"b304148c-ab0e-42ac-966a-024ff59a8cde\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.412294 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b304148c-ab0e-42ac-966a-024ff59a8cde\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.412315 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache\") pod \"b304148c-ab0e-42ac-966a-024ff59a8cde\" (UID: \"b304148c-ab0e-42ac-966a-024ff59a8cde\") " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413076 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock" (OuterVolumeSpecName: "lock") pod "b304148c-ab0e-42ac-966a-024ff59a8cde" (UID: 
"b304148c-ab0e-42ac-966a-024ff59a8cde"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413105 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache" (OuterVolumeSpecName: "cache") pod "b304148c-ab0e-42ac-966a-024ff59a8cde" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413385 4954 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-cache\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413406 4954 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413415 4954 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413424 4954 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413434 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85b49d0-cc8d-4dce-aade-6c63af659f42-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413446 4954 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a85b49d0-cc8d-4dce-aade-6c63af659f42-var-lib\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413454 4954 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b304148c-ab0e-42ac-966a-024ff59a8cde-lock\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.413463 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pxxb\" (UniqueName: \"kubernetes.io/projected/a85b49d0-cc8d-4dce-aade-6c63af659f42-kube-api-access-9pxxb\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.415525 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "swift") pod "b304148c-ab0e-42ac-966a-024ff59a8cde" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.415837 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b304148c-ab0e-42ac-966a-024ff59a8cde" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.415867 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl" (OuterVolumeSpecName: "kube-api-access-v69bl") pod "b304148c-ab0e-42ac-966a-024ff59a8cde" (UID: "b304148c-ab0e-42ac-966a-024ff59a8cde"). InnerVolumeSpecName "kube-api-access-v69bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.443640 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:23:13 crc kubenswrapper[4954]: E1206 07:23:13.443954 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.515226 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v69bl\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-kube-api-access-v69bl\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.515278 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b304148c-ab0e-42ac-966a-024ff59a8cde-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.515300 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.553917 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 06 07:23:13 crc kubenswrapper[4954]: I1206 07:23:13.616899 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.052989 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b304148c-ab0e-42ac-966a-024ff59a8cde","Type":"ContainerDied","Data":"06a142b099c41eaf5928932556cdf7352557bc53eba3e9a9bf04b502832ff6dc"} Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.053061 4954 scope.go:117] "RemoveContainer" containerID="aa40c5ec6d752ec2fbe2321f2103def28820a11e0c19b5134c7eb5bd9a35558b" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.053336 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.062084 4954 generic.go:334] "Generic (PLEG): container finished" podID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerID="25aaea2454949bcf5aa44f34e823a052575eba578039116af36245405f9d00cf" exitCode=137 Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.062220 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerDied","Data":"25aaea2454949bcf5aa44f34e823a052575eba578039116af36245405f9d00cf"} Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.066734 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xskgs_a85b49d0-cc8d-4dce-aade-6c63af659f42/ovs-vswitchd/0.log" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.070360 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xskgs" event={"ID":"a85b49d0-cc8d-4dce-aade-6c63af659f42","Type":"ContainerDied","Data":"b2797e7aef4be7b6a937371a4afb3ffdfa4ce3a020316108cab88a723f7b13b3"} Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.070524 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xskgs" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.087339 4954 scope.go:117] "RemoveContainer" containerID="60ca35ac3122f717dfae9adfc952b010f5c490d1fbab75b0ecd0e2061e89d87e" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.098070 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.107835 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.109483 4954 scope.go:117] "RemoveContainer" containerID="85237b97891e4838aee5911e5b516a52841339958665c7a83d1d2d7121799c5a" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.129420 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xskgs"] Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.138917 4954 scope.go:117] "RemoveContainer" containerID="3e090d4172cdef300aa94f8d8fba36deddc5d8017b13b15dfe18162d7c6aa8ef" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.139520 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-xskgs"] Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.203109 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.210653 4954 scope.go:117] "RemoveContainer" containerID="9e6c70c416cc446ced1c2039d0651b51abc7bb3be2892db83aa00306a5d00bb3" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.227993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228076 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228120 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz2n8\" (UniqueName: \"kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228150 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228269 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle\") pod \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\" (UID: \"a6b3ac97-ce01-4110-9dd5-fee903dd5204\") " Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.228955 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.233437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts" (OuterVolumeSpecName: "scripts") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.233784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8" (OuterVolumeSpecName: "kube-api-access-kz2n8") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "kube-api-access-kz2n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.250132 4954 scope.go:117] "RemoveContainer" containerID="fd7f3929fff74fd65f6989375596655e7c51806663d983bd2aad57effe232114" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.256504 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.280271 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.300811 4954 scope.go:117] "RemoveContainer" containerID="e533f83b20e9f9300f4fb16b6fe9cdf376566cb8e295ffbbfa5b2a2195bb2002" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.315719 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data" (OuterVolumeSpecName: "config-data") pod "a6b3ac97-ce01-4110-9dd5-fee903dd5204" (UID: "a6b3ac97-ce01-4110-9dd5-fee903dd5204"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329627 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329664 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329675 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz2n8\" (UniqueName: \"kubernetes.io/projected/a6b3ac97-ce01-4110-9dd5-fee903dd5204-kube-api-access-kz2n8\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329689 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6b3ac97-ce01-4110-9dd5-fee903dd5204-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329699 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.329708 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6b3ac97-ce01-4110-9dd5-fee903dd5204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.332497 4954 scope.go:117] "RemoveContainer" containerID="3a5cc68bf70f6cc371a8cb6cffab174f354261b971b711b47d5d116af25c682e" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.363524 4954 scope.go:117] "RemoveContainer" containerID="ec4a2a2f803658eaa7ba7c7d378bed85b1093f0a8c4791ca3b80cb2214e6e449" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.397442 4954 scope.go:117] "RemoveContainer" containerID="bb1952930feb993e59b60f78d883b948aa239b6e485225832fb60b3345763360" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.420228 4954 scope.go:117] "RemoveContainer" containerID="854d5d618d13ec03ec6c2f4744287e27f89042a1dae6f369b12bbc3408f4ff5c" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.449522 4954 scope.go:117] "RemoveContainer" containerID="17fe025c33d04ce12223561653e1cb4e9236655eaf9cd02886b52df15dccb969" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.472429 4954 scope.go:117] "RemoveContainer" containerID="7d0ab20184ee34faada856909d1b2e18f0f407fa1b19e33fe65f5f582f55f86c" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.492152 4954 scope.go:117] "RemoveContainer" containerID="1bb5213e7cc1c7b665991c2e686f0fcd95258f340b82e006fe07376d1ac56ed8" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.516022 4954 scope.go:117] "RemoveContainer" containerID="68942f2c61b18621daaf78d82703376edb42951c9bc918b02a792818df0d6cc3" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.542105 4954 scope.go:117] "RemoveContainer" containerID="f45652c0bc1504fd7b8b9fee6209517c73cff53e334ea69e35618387a77bfb4d" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.562617 4954 scope.go:117] "RemoveContainer" containerID="2a574d36866174e9e7b38c31dd68fef50b58061fdac91210325a7883ddd5d3cc" Dec 06 07:23:14 crc kubenswrapper[4954]: I1206 07:23:14.584191 4954 scope.go:117] "RemoveContainer" 
containerID="f4bf4637ad6f38b3a7a4d4f5071df3a875d12aa499a4cfceb51cfd663705b57f" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.093469 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.093433 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a6b3ac97-ce01-4110-9dd5-fee903dd5204","Type":"ContainerDied","Data":"242e83ae58481e6672ec1431f57fd8aa94eb2721e13b54b3a8eb1f6bf979046a"} Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.093640 4954 scope.go:117] "RemoveContainer" containerID="dde3cc2c35c6b947b115a84354f88ab0c259ca23e0659e70ff1f7de4d80de96e" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.125485 4954 scope.go:117] "RemoveContainer" containerID="25aaea2454949bcf5aa44f34e823a052575eba578039116af36245405f9d00cf" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.144043 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.150110 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.453759 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" path="/var/lib/kubelet/pods/a6b3ac97-ce01-4110-9dd5-fee903dd5204/volumes" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.454333 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" path="/var/lib/kubelet/pods/a85b49d0-cc8d-4dce-aade-6c63af659f42/volumes" Dec 06 07:23:15 crc kubenswrapper[4954]: I1206 07:23:15.454985 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" path="/var/lib/kubelet/pods/b304148c-ab0e-42ac-966a-024ff59a8cde/volumes" Dec 06 07:23:16 crc kubenswrapper[4954]: I1206 07:23:16.796769 4954 scope.go:117] "RemoveContainer" containerID="98c8778c7e8353cc4dc52283f262371684ed883dec692ca28a39a816cbb59cd0" Dec 06 07:23:16 crc kubenswrapper[4954]: I1206 07:23:16.825289 4954 scope.go:117] "RemoveContainer" containerID="d27d5d9f536cc22c14abfb14b6f38b3f0d61e96cf9b23ee50d20f02267f9999f" Dec 06 07:23:18 crc kubenswrapper[4954]: I1206 07:23:18.359776 4954 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod24a72b28-2cf7-47e0-b7c2-5ff92acedfe7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod24a72b28-2cf7-47e0-b7c2-5ff92acedfe7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod24a72b28_2cf7_47e0_b7c2_5ff92acedfe7.slice" Dec 06 07:23:18 crc kubenswrapper[4954]: E1206 07:23:18.359884 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod24a72b28-2cf7-47e0-b7c2-5ff92acedfe7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod24a72b28-2cf7-47e0-b7c2-5ff92acedfe7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod24a72b28_2cf7_47e0_b7c2_5ff92acedfe7.slice" pod="openstack/ovsdbserver-nb-0" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" Dec 06 07:23:19 crc kubenswrapper[4954]: I1206 07:23:19.146310 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 07:23:19 crc kubenswrapper[4954]: I1206 07:23:19.175154 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 07:23:19 crc kubenswrapper[4954]: I1206 07:23:19.184501 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 07:23:19 crc kubenswrapper[4954]: I1206 07:23:19.454603 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a72b28-2cf7-47e0-b7c2-5ff92acedfe7" path="/var/lib/kubelet/pods/24a72b28-2cf7-47e0-b7c2-5ff92acedfe7/volumes" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526252 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526770 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-notification-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526788 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-notification-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526815 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526823 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526835 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" containerName="nova-cell0-conductor-conductor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526846 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" containerName="nova-cell0-conductor-conductor" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526862 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="sg-core" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526871 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="sg-core" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526882 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526890 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-server" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526901 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="swift-recon-cron" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526909 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="swift-recon-cron" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526918 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerName="kube-state-metrics" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526929 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerName="kube-state-metrics" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526945 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="cinder-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526954 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="cinder-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526969 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="proxy-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526976 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="proxy-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.526987 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server-init" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.526995 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server-init" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527005 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="setup-container" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527014 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="setup-container" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527026 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527037 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527069 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527080 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-server" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527096 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="rabbitmq" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527108 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="rabbitmq" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527122 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c417ca1e-22df-4163-96f9-349df3d624e8" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527133 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c417ca1e-22df-4163-96f9-349df3d624e8" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527154 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-reaper" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527169 4954 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-reaper" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527182 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="probe" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527192 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="probe" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527213 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="galera" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527224 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="galera" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527240 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="mysql-bootstrap" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527251 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="mysql-bootstrap" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527266 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527276 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-server" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527297 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527308 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527326 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-expirer" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527339 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-expirer" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527355 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527364 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527383 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527394 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527408 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097befa0-58fe-4616-bed7-ada4f7d81ce3" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527418 4954 
state_mem.go:107] "Deleted CPUSet assignment" podUID="097befa0-58fe-4616-bed7-ada4f7d81ce3" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527437 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527447 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-api" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527460 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527471 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527488 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527497 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527512 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527523 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527543 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="setup-container" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527554 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="setup-container" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527592 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527601 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527614 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527622 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527636 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693e40da-d019-421e-83a8-6dc351580607" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527645 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="693e40da-d019-421e-83a8-6dc351580607" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527656 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="rabbitmq" Dec 06 07:23:20 crc 
kubenswrapper[4954]: I1206 07:23:20.527664 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="rabbitmq" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527676 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527684 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527699 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527706 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api-log" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527721 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerName="memcached" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527730 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerName="memcached" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527740 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527748 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527760 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418e92aa-4713-4a55-b4d5-650587fcb6ca" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527770 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="418e92aa-4713-4a55-b4d5-650587fcb6ca" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527784 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527794 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527810 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527822 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-log" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527839 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="openstack-network-exporter" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527850 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="openstack-network-exporter" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527863 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="rsync" Dec 06 
07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527875 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="rsync" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527888 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527898 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker-log" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527913 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527920 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527936 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a84a0f-fa51-461c-a281-6b832ad39aa7" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527945 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a84a0f-fa51-461c-a281-6b832ad39aa7" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527961 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527971 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.527984 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.527993 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528007 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528016 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-api" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528034 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528045 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528060 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" containerName="keystone-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528070 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" containerName="keystone-api" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528089 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" 
containerName="ceilometer-central-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528101 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-central-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528124 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528134 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: E1206 07:23:20.528147 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerName="nova-scheduler-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528159 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerName="nova-scheduler-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528490 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528512 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528533 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528542 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528556 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528611 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b907c888-706a-4183-b581-ff7b4742fc74" containerName="barbican-worker" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528626 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-central-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528643 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="578bec25-a54c-4f52-95f2-19f20f833437" containerName="rabbitmq" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528655 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25414b25-a6cc-41e6-8360-3e85f54321d5" containerName="barbican-api-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528666 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="openstack-network-exporter" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528680 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528695 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovsdb-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 
07:23:20.528708 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528718 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="096c2131-031b-4573-ade7-b1d0d34abc60" containerName="neutron-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528732 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84d04d2-6282-4a9c-89a8-3aa64ef22c74" containerName="placement-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528740 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85b49d0-cc8d-4dce-aade-6c63af659f42" containerName="ovs-vswitchd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528751 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2306ed-dcff-4143-a452-9a209d0a46a1" containerName="nova-cell0-conductor-conductor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528765 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ccdb46-97b8-40d8-aebb-5cf28cb6854d" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528776 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-updater" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528791 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="probe" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528803 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a82ba0f-bb07-4959-bfdd-8a420c617835" containerName="memcached" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528815 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528832 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="418e92aa-4713-4a55-b4d5-650587fcb6ca" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528845 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528854 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-server" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528865 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-auditor" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528879 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-metadata" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528896 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="693e40da-d019-421e-83a8-6dc351580607" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528908 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-expirer" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528921 4954 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528932 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="object-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528943 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="097befa0-58fe-4616-bed7-ada4f7d81ce3" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d774b5-3c0f-4b86-ad33-b6bfd49c1b46" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528961 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="841d2cc4-0265-4e02-af59-f0f322208f02" containerName="galera" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528978 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="432b9d93-b045-4e25-b58b-b3a6fd8512c4" containerName="nova-metadata-log" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.528990 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a84a0f-fa51-461c-a281-6b832ad39aa7" containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529004 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="ceilometer-notification-agent" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529015 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="account-reaper" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529029 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="rsync" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529043 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="container-replicator" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529056 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7924fc4f-0ab9-4805-8d77-3a1fe2953fe1" containerName="nova-scheduler-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529070 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de2aad5-fb15-489c-b0fc-200e18ad3baa" containerName="ovn-northd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529083 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="938d0b4f-21d2-4972-8436-eb1fbd6db5bc" containerName="kube-state-metrics" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529098 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="31452db7-e2c4-4e61-8f8c-7017476f0bc0" containerName="rabbitmq" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529110 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b304148c-ab0e-42ac-966a-024ff59a8cde" containerName="swift-recon-cron" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529122 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0dd9bc-fbf8-4692-bc1f-1f1b001a2d3a" containerName="keystone-api" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529133 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c417ca1e-22df-4163-96f9-349df3d624e8" 
containerName="mariadb-account-delete" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529147 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="proxy-httpd" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529161 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b3ac97-ce01-4110-9dd5-fee903dd5204" containerName="cinder-scheduler" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.529175 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6b0598-f49f-4300-a2e3-edb512001517" containerName="sg-core" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.530459 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.541826 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.730473 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.730542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.730758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l69wq\" (UniqueName: \"kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.832101 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l69wq\" (UniqueName: \"kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.832276 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.832325 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.832953 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.833115 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:20 crc kubenswrapper[4954]: I1206 07:23:20.867719 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l69wq\" (UniqueName: \"kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq\") pod \"community-operators-zxf5c\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:21 crc kubenswrapper[4954]: I1206 07:23:21.159464 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:21 crc kubenswrapper[4954]: I1206 07:23:21.571450 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:22 crc kubenswrapper[4954]: I1206 07:23:22.173476 4954 generic.go:334] "Generic (PLEG): container finished" podID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerID="be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6" exitCode=0 Dec 06 07:23:22 crc kubenswrapper[4954]: I1206 07:23:22.173595 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerDied","Data":"be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6"} Dec 06 07:23:22 crc kubenswrapper[4954]: I1206 07:23:22.174271 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerStarted","Data":"11e8bbff206cdefe0928fa4297c14028f9f4e688e99cc3bc103cccb9e3739be9"} Dec 06 07:23:23 crc kubenswrapper[4954]: I1206 07:23:23.182627 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerStarted","Data":"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f"} Dec 06 07:23:24 crc kubenswrapper[4954]: I1206 07:23:24.199114 4954 generic.go:334] "Generic (PLEG): container finished" podID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerID="bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f" exitCode=0 Dec 06 07:23:24 crc kubenswrapper[4954]: I1206 07:23:24.199182 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerDied","Data":"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f"} Dec 06 07:23:25 crc kubenswrapper[4954]: I1206 07:23:25.213335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" 
event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerStarted","Data":"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68"} Dec 06 07:23:25 crc kubenswrapper[4954]: I1206 07:23:25.232969 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zxf5c" podStartSLOduration=2.805725661 podStartE2EDuration="5.232948322s" podCreationTimestamp="2025-12-06 07:23:20 +0000 UTC" firstStartedPulling="2025-12-06 07:23:22.17514633 +0000 UTC m=+1576.988505719" lastFinishedPulling="2025-12-06 07:23:24.602368971 +0000 UTC m=+1579.415728380" observedRunningTime="2025-12-06 07:23:25.231891723 +0000 UTC m=+1580.045251142" watchObservedRunningTime="2025-12-06 07:23:25.232948322 +0000 UTC m=+1580.046307711" Dec 06 07:23:26 crc kubenswrapper[4954]: I1206 07:23:26.443133 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:23:26 crc kubenswrapper[4954]: E1206 07:23:26.443645 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:23:31 crc kubenswrapper[4954]: I1206 07:23:31.159726 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:31 crc kubenswrapper[4954]: I1206 07:23:31.160126 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:31 crc kubenswrapper[4954]: I1206 07:23:31.230843 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:31 crc kubenswrapper[4954]: I1206 07:23:31.314218 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:31 crc kubenswrapper[4954]: I1206 07:23:31.479007 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:33 crc kubenswrapper[4954]: I1206 07:23:33.284953 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zxf5c" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="registry-server" containerID="cri-o://9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68" gracePeriod=2 Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.179684 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.233541 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content\") pod \"63156d2f-94cb-4cd8-9c2b-c29620085234\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.233644 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l69wq\" (UniqueName: \"kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq\") pod \"63156d2f-94cb-4cd8-9c2b-c29620085234\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.233714 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities\") pod \"63156d2f-94cb-4cd8-9c2b-c29620085234\" (UID: \"63156d2f-94cb-4cd8-9c2b-c29620085234\") " Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.235064 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities" (OuterVolumeSpecName: "utilities") pod "63156d2f-94cb-4cd8-9c2b-c29620085234" (UID: "63156d2f-94cb-4cd8-9c2b-c29620085234"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.240295 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq" (OuterVolumeSpecName: "kube-api-access-l69wq") pod "63156d2f-94cb-4cd8-9c2b-c29620085234" (UID: "63156d2f-94cb-4cd8-9c2b-c29620085234"). InnerVolumeSpecName "kube-api-access-l69wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.290954 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63156d2f-94cb-4cd8-9c2b-c29620085234" (UID: "63156d2f-94cb-4cd8-9c2b-c29620085234"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.294740 4954 generic.go:334] "Generic (PLEG): container finished" podID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerID="9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68" exitCode=0 Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.294785 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerDied","Data":"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68"} Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.294819 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxf5c" event={"ID":"63156d2f-94cb-4cd8-9c2b-c29620085234","Type":"ContainerDied","Data":"11e8bbff206cdefe0928fa4297c14028f9f4e688e99cc3bc103cccb9e3739be9"} Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.294841 4954 scope.go:117] "RemoveContainer" containerID="9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.295007 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zxf5c" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.326695 4954 scope.go:117] "RemoveContainer" containerID="bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.334896 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.334938 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63156d2f-94cb-4cd8-9c2b-c29620085234-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.334967 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l69wq\" (UniqueName: \"kubernetes.io/projected/63156d2f-94cb-4cd8-9c2b-c29620085234-kube-api-access-l69wq\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.337226 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.344328 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zxf5c"] Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.355626 4954 scope.go:117] "RemoveContainer" containerID="be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.378801 4954 scope.go:117] "RemoveContainer" containerID="9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68" Dec 06 07:23:34 crc kubenswrapper[4954]: E1206 07:23:34.379862 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68\": container with ID starting with 9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68 not found: ID does not exist" containerID="9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.379918 
4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68"} err="failed to get container status \"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68\": rpc error: code = NotFound desc = could not find container \"9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68\": container with ID starting with 9a2e6dbd875be79268501c5193a1700fc3502ecc34f599a6563955034922ca68 not found: ID does not exist" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.379961 4954 scope.go:117] "RemoveContainer" containerID="bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f" Dec 06 07:23:34 crc kubenswrapper[4954]: E1206 07:23:34.380388 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f\": container with ID starting with bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f not found: ID does not exist" containerID="bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.380420 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f"} err="failed to get container status \"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f\": rpc error: code = NotFound desc = could not find container \"bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f\": container with ID starting with bc8bc0dbc158abe68ce8e1d5cf5ddda1b6b04857ccb709a9705e58a16fbe572f not found: ID does not exist" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.380441 4954 scope.go:117] "RemoveContainer" containerID="be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6" Dec 06 07:23:34 crc kubenswrapper[4954]: E1206 07:23:34.380819 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6\": container with ID starting with be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6 not found: ID does not exist" containerID="be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6" Dec 06 07:23:34 crc kubenswrapper[4954]: I1206 07:23:34.380849 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6"} err="failed to get container status \"be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6\": rpc error: code = NotFound desc = could not find container \"be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6\": container with ID starting with be162535674463b3e997243ffdc277f6f2946a9f1aae039a4acf37a4382e0af6 not found: ID does not exist" Dec 06 07:23:35 crc kubenswrapper[4954]: I1206 07:23:35.454281 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" path="/var/lib/kubelet/pods/63156d2f-94cb-4cd8-9c2b-c29620085234/volumes" Dec 06 07:23:41 crc kubenswrapper[4954]: I1206 07:23:41.443901 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:23:41 crc kubenswrapper[4954]: E1206 07:23:41.444788 4954 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.865997 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:43 crc kubenswrapper[4954]: E1206 07:23:43.866443 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="extract-utilities" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.866461 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="extract-utilities" Dec 06 07:23:43 crc kubenswrapper[4954]: E1206 07:23:43.866518 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="registry-server" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.866528 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="registry-server" Dec 06 07:23:43 crc kubenswrapper[4954]: E1206 07:23:43.866700 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="extract-content" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.866805 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="extract-content" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.867498 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="63156d2f-94cb-4cd8-9c2b-c29620085234" containerName="registry-server" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.869925 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.892658 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.986134 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.986338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:43 crc kubenswrapper[4954]: I1206 07:23:43.986439 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84lj\" (UniqueName: \"kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.087633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.087736 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.087790 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b84lj\" (UniqueName: \"kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.088150 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.088180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.123608 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b84lj\" (UniqueName: \"kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj\") pod \"redhat-marketplace-crjh6\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.215163 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:44 crc kubenswrapper[4954]: I1206 07:23:44.688448 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:45 crc kubenswrapper[4954]: I1206 07:23:45.409369 4954 generic.go:334] "Generic (PLEG): container finished" podID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerID="45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc" exitCode=0 Dec 06 07:23:45 crc kubenswrapper[4954]: I1206 07:23:45.409437 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerDied","Data":"45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc"} Dec 06 07:23:45 crc kubenswrapper[4954]: I1206 07:23:45.409483 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerStarted","Data":"a0eff67951d2ac13831ed8e26f5136b6c035f00ff2df3e04ea03c319666b4002"} Dec 06 07:23:46 crc kubenswrapper[4954]: I1206 07:23:46.420970 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerStarted","Data":"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11"} Dec 06 07:23:47 crc kubenswrapper[4954]: I1206 07:23:47.437774 4954 generic.go:334] "Generic (PLEG): container finished" podID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerID="70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11" exitCode=0 Dec 06 07:23:47 crc kubenswrapper[4954]: I1206 07:23:47.437822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerDied","Data":"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11"} Dec 06 07:23:48 crc kubenswrapper[4954]: I1206 07:23:48.452580 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerStarted","Data":"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09"} Dec 06 07:23:48 crc kubenswrapper[4954]: I1206 07:23:48.475636 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-crjh6" podStartSLOduration=3.003140678 podStartE2EDuration="5.475614297s" podCreationTimestamp="2025-12-06 07:23:43 +0000 UTC" firstStartedPulling="2025-12-06 07:23:45.412969086 +0000 UTC m=+1600.226328515" lastFinishedPulling="2025-12-06 07:23:47.885442715 +0000 UTC m=+1602.698802134" observedRunningTime="2025-12-06 07:23:48.474266041 +0000 UTC m=+1603.287625440" watchObservedRunningTime="2025-12-06 07:23:48.475614297 +0000 UTC m=+1603.288973716" Dec 06 07:23:52 crc kubenswrapper[4954]: I1206 07:23:52.443718 4954 scope.go:117] "RemoveContainer" 
containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:23:52 crc kubenswrapper[4954]: E1206 07:23:52.444430 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:23:54 crc kubenswrapper[4954]: I1206 07:23:54.217884 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:54 crc kubenswrapper[4954]: I1206 07:23:54.217936 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:54 crc kubenswrapper[4954]: I1206 07:23:54.261993 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:54 crc kubenswrapper[4954]: I1206 07:23:54.568546 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:54 crc kubenswrapper[4954]: I1206 07:23:54.614770 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:56 crc kubenswrapper[4954]: I1206 07:23:56.526843 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-crjh6" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="registry-server" containerID="cri-o://ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09" gracePeriod=2 Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.495068 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.551249 4954 generic.go:334] "Generic (PLEG): container finished" podID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerID="ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09" exitCode=0 Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.551319 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerDied","Data":"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09"} Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.551359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crjh6" event={"ID":"de4f7704-da43-4626-b1b1-6fb8e157be37","Type":"ContainerDied","Data":"a0eff67951d2ac13831ed8e26f5136b6c035f00ff2df3e04ea03c319666b4002"} Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.551385 4954 scope.go:117] "RemoveContainer" containerID="ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.551520 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crjh6" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.576674 4954 scope.go:117] "RemoveContainer" containerID="70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.597139 4954 scope.go:117] "RemoveContainer" containerID="45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.625463 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities\") pod \"de4f7704-da43-4626-b1b1-6fb8e157be37\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.625512 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content\") pod \"de4f7704-da43-4626-b1b1-6fb8e157be37\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.625558 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b84lj\" (UniqueName: \"kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj\") pod \"de4f7704-da43-4626-b1b1-6fb8e157be37\" (UID: \"de4f7704-da43-4626-b1b1-6fb8e157be37\") " Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.626842 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities" (OuterVolumeSpecName: "utilities") pod "de4f7704-da43-4626-b1b1-6fb8e157be37" (UID: "de4f7704-da43-4626-b1b1-6fb8e157be37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.637156 4954 scope.go:117] "RemoveContainer" containerID="ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09" Dec 06 07:23:57 crc kubenswrapper[4954]: E1206 07:23:57.639042 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09\": container with ID starting with ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09 not found: ID does not exist" containerID="ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.639122 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09"} err="failed to get container status \"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09\": rpc error: code = NotFound desc = could not find container \"ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09\": container with ID starting with ebc4e77a2276e6047f10ff0df7c77c50cbc5502ae2809d34193e4e727164aa09 not found: ID does not exist" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.639169 4954 scope.go:117] "RemoveContainer" containerID="70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11" Dec 06 07:23:57 crc kubenswrapper[4954]: E1206 07:23:57.639637 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11\": container with ID starting with 70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11 not found: ID does not exist" containerID="70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.639678 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11"} err="failed to get container status \"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11\": rpc error: code = NotFound desc = could not find container \"70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11\": container with ID starting with 70b3afe42ad51357013e395fbf5106f19b0a3cc2eb9142aea7eb9ed7b64e5f11 not found: ID does not exist" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.639711 4954 scope.go:117] "RemoveContainer" containerID="45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.639848 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:57 crc kubenswrapper[4954]: E1206 07:23:57.641358 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc\": container with ID starting with 45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc not found: ID does not exist" containerID="45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.641418 4954 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc"} err="failed to get container status \"45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc\": rpc error: code = NotFound desc = could not find container \"45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc\": container with ID starting with 45f4be0644813bcd6eb497f64fd9d28d43a1bbe0410ee73d790b695e0dc933dc not found: ID does not exist" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.645946 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj" (OuterVolumeSpecName: "kube-api-access-b84lj") pod "de4f7704-da43-4626-b1b1-6fb8e157be37" (UID: "de4f7704-da43-4626-b1b1-6fb8e157be37"). InnerVolumeSpecName "kube-api-access-b84lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.660981 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de4f7704-da43-4626-b1b1-6fb8e157be37" (UID: "de4f7704-da43-4626-b1b1-6fb8e157be37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.741317 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4f7704-da43-4626-b1b1-6fb8e157be37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.741359 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b84lj\" (UniqueName: \"kubernetes.io/projected/de4f7704-da43-4626-b1b1-6fb8e157be37-kube-api-access-b84lj\") on node \"crc\" DevicePath \"\"" Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.898459 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:57 crc kubenswrapper[4954]: I1206 07:23:57.903696 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-crjh6"] Dec 06 07:23:59 crc kubenswrapper[4954]: I1206 07:23:59.481142 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" path="/var/lib/kubelet/pods/de4f7704-da43-4626-b1b1-6fb8e157be37/volumes" Dec 06 07:24:07 crc kubenswrapper[4954]: I1206 07:24:07.443261 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:24:07 crc kubenswrapper[4954]: E1206 07:24:07.444387 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.306812 4954 scope.go:117] "RemoveContainer" containerID="5da64da480bd207602ab54760066366c1392ecfb9339d3d709fba33f063c4aac" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.355593 4954 scope.go:117] "RemoveContainer" containerID="70ef882091070bde0bec956facf971437b651cffaa48697c11f169c6fe3617c5" 
Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.395863 4954 scope.go:117] "RemoveContainer" containerID="3018d3d8b3cb5a9e3856801dbdf5ff7b95038c330f1d30888a44744a692d61e1" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.454686 4954 scope.go:117] "RemoveContainer" containerID="26f1310e589079d2c29a84c9a9c823b2acb1c438f20660e7aedabf9a59ec76dd" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.494401 4954 scope.go:117] "RemoveContainer" containerID="f9ce5a5580680603b5f5978b8290b06d0f73e55dfade2adf905db6a7508d152b" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.525779 4954 scope.go:117] "RemoveContainer" containerID="928615f26172b5cb755987526ac5bb1eb4f54e75a55d4d52248a3bd9c543805c" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.566642 4954 scope.go:117] "RemoveContainer" containerID="6bab50824cc7d0430e80d33b5c7beda9916d889f5e5c1a012ab2aebf51826a03" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.595370 4954 scope.go:117] "RemoveContainer" containerID="f5f430ad3df9e0cf4cc84ee32b89af25b995b3297d702d7f97feec4a11fa4787" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.619966 4954 scope.go:117] "RemoveContainer" containerID="76b9762d25a0f68e1ca9721e3f881fc79744180682ca9bbe031f0e0466f604df" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.656181 4954 scope.go:117] "RemoveContainer" containerID="fa14aef8f5202091e49f53bf62c1c85cbe928b299e3ba4aafb049519f6b59f0a" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.696142 4954 scope.go:117] "RemoveContainer" containerID="7871e18bb8292414c78e7334c458b2779277880ca11659a8561bd1664ae80f7d" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.723901 4954 scope.go:117] "RemoveContainer" containerID="549dfa3cfa0a8efb669e86e3de71c2ea1577bb4859050d03b3d41b11fa4dae74" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.749701 4954 scope.go:117] "RemoveContainer" containerID="2be43c7ef80f1e974ba1bc9f1ee60912104a84a0dc25733164e3f37e1c0be14a" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.782786 4954 scope.go:117] "RemoveContainer" containerID="16c047b2e6afd9ce52ca0f1e8a2b7459d989ea4b4352f8df7247e410e814cf59" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.840312 4954 scope.go:117] "RemoveContainer" containerID="085463d09eb36984e71b25ab74286f6b9bc39b6985a17aff26c32c1c17ebc34b" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.878220 4954 scope.go:117] "RemoveContainer" containerID="0bf16b7efb1f2617221d73830bc9e49e51a86fe1c47c1f391e18bcba68548179" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.917143 4954 scope.go:117] "RemoveContainer" containerID="c9cb2b326ffd728ac824c05ed72c22a0ac49dd17dd685e63e0cde489bf63efb1" Dec 06 07:24:17 crc kubenswrapper[4954]: I1206 07:24:17.943625 4954 scope.go:117] "RemoveContainer" containerID="051545d1d756d25636df272c7f761b7921bf9ef327f55451e1d78ab235564be7" Dec 06 07:24:18 crc kubenswrapper[4954]: I1206 07:24:18.444310 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:24:18 crc kubenswrapper[4954]: E1206 07:24:18.445338 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:24:32 
crc kubenswrapper[4954]: I1206 07:24:32.444862 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:24:32 crc kubenswrapper[4954]: E1206 07:24:32.446328 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:24:44 crc kubenswrapper[4954]: I1206 07:24:44.443903 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:24:44 crc kubenswrapper[4954]: E1206 07:24:44.445423 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:24:55 crc kubenswrapper[4954]: I1206 07:24:55.454491 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:24:55 crc kubenswrapper[4954]: E1206 07:24:55.455986 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:25:08 crc kubenswrapper[4954]: I1206 07:25:08.444503 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:25:08 crc kubenswrapper[4954]: E1206 07:25:08.445943 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.412118 4954 scope.go:117] "RemoveContainer" containerID="511c18ca4df0ed8aff0072864f0d0e77eddb735086b7978043e9e0becffa93c2" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.495467 4954 scope.go:117] "RemoveContainer" containerID="6f6eabd999407ab0205f96cb4ec712282a45d1316429c1b890f020566f380231" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.526793 4954 scope.go:117] "RemoveContainer" containerID="4dde86b4c7309d7ef903e4cc4005a418902a0316bf43d573aa6b04182fb92324" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.586941 4954 scope.go:117] "RemoveContainer" containerID="3ff1ddc524f22f0e5266e63bc6127148019f544502da9441ee205d663d4cf95d" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.607965 4954 scope.go:117] "RemoveContainer" 
containerID="2ec47098771e81a3e2ed711acad31be02d7763e790d88dde621f4b4966df4ef7" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.645645 4954 scope.go:117] "RemoveContainer" containerID="df8e570779d97aab43c23df58a337a053e63cc302fbe36e683dc6788a6773e03" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.721724 4954 scope.go:117] "RemoveContainer" containerID="73b4fc68357e95fab522f5bfaa524801ed8b0c73a9d9a696e2f90c09e2776edb" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.773047 4954 scope.go:117] "RemoveContainer" containerID="470a3bb2487480880ccb6620497d7d369d86596f60ec23a908780d07204526e8" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.812065 4954 scope.go:117] "RemoveContainer" containerID="7d138999999fc4107a55db7ab0c535ac484c83e49f640de1f842c8caf3e53734" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.836631 4954 scope.go:117] "RemoveContainer" containerID="1c274f4ec368dd0bbdd0327af80dcdd0dbe4da385fae01741d472204dc012d51" Dec 06 07:25:18 crc kubenswrapper[4954]: I1206 07:25:18.869926 4954 scope.go:117] "RemoveContainer" containerID="71b04eafce145d86555e0beb19909a5726709e8e20af73819ad4cf85b6cd66fc" Dec 06 07:25:21 crc kubenswrapper[4954]: I1206 07:25:21.444521 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:25:21 crc kubenswrapper[4954]: E1206 07:25:21.445072 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:25:34 crc kubenswrapper[4954]: I1206 07:25:34.443696 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:25:34 crc kubenswrapper[4954]: E1206 07:25:34.445108 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:25:48 crc kubenswrapper[4954]: I1206 07:25:48.445074 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:25:48 crc kubenswrapper[4954]: E1206 07:25:48.446538 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:26:02 crc kubenswrapper[4954]: I1206 07:26:02.444789 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:26:02 crc kubenswrapper[4954]: E1206 07:26:02.446651 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:26:15 crc kubenswrapper[4954]: I1206 07:26:15.452440 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:26:15 crc kubenswrapper[4954]: E1206 07:26:15.453924 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.146530 4954 scope.go:117] "RemoveContainer" containerID="5bbcaa0af76358e95f0c085a663e99d6d9414aa7753d1930befd841c33827a83" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.181471 4954 scope.go:117] "RemoveContainer" containerID="3515d930ad170dbba2ebd7d24e7efa55655ce22df58ee2066ea9b387286c8337" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.238288 4954 scope.go:117] "RemoveContainer" containerID="d13d80f84977f1d7303d6793ff01f19f9db19816a45b876992777f3cab048b0d" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.277765 4954 scope.go:117] "RemoveContainer" containerID="69544f16f8d5ca7d182a797a8081f7506395ae0820d8b9e4721bb6f746132efe" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.317976 4954 scope.go:117] "RemoveContainer" containerID="ff4a1927bb7fd5e4e0bc2af413f92d09c51402a69dd5aa94a639ab5dbab627ea" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.349518 4954 scope.go:117] "RemoveContainer" containerID="81f111a5ef62bb2c7c53d724ac3b5a5c348a8db6b2c997864f73ca9b9165eb0a" Dec 06 07:26:19 crc kubenswrapper[4954]: I1206 07:26:19.378259 4954 scope.go:117] "RemoveContainer" containerID="feb559d91bf9370a973f29e0a2791a8eef9c04f353959aa041a0f510e44856c2" Dec 06 07:26:30 crc kubenswrapper[4954]: I1206 07:26:30.443968 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:26:30 crc kubenswrapper[4954]: E1206 07:26:30.444967 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:26:44 crc kubenswrapper[4954]: I1206 07:26:44.444333 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:26:44 crc kubenswrapper[4954]: E1206 07:26:44.445810 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:26:56 crc kubenswrapper[4954]: I1206 07:26:56.444225 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:26:56 crc kubenswrapper[4954]: E1206 07:26:56.445258 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:27:10 crc kubenswrapper[4954]: I1206 07:27:10.444029 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:27:11 crc kubenswrapper[4954]: I1206 07:27:11.054427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775"} Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.537508 4954 scope.go:117] "RemoveContainer" containerID="e3081561a734c7c5623a46dc8f2eda43010b66d623c6f028647ac25db2e6ffee" Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.606255 4954 scope.go:117] "RemoveContainer" containerID="8afbd1dae36253d138592f7b2ecd6d436f3b2d2cbaae7e1362a7c3a410bb0e16" Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.636999 4954 scope.go:117] "RemoveContainer" containerID="073e61f5d90665e18b8c3396d1a787b2657cc16236e7aacf7f53848c5eb3adf8" Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.671128 4954 scope.go:117] "RemoveContainer" containerID="cfdf07915e103b77672f6abb91e43ab46cd191d43f002e4b0c5c7cdd130880af" Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.704726 4954 scope.go:117] "RemoveContainer" containerID="5e837fa95d1fa9b7b115359fe96d4dd565c2b734b97c7894835e55a0e0e55069" Dec 06 07:27:19 crc kubenswrapper[4954]: I1206 07:27:19.771373 4954 scope.go:117] "RemoveContainer" containerID="fbf01d8c738f88aa0917c5da1a6fed31a6e5ed43c6500aa8c04f15f09dd7184a" Dec 06 07:28:19 crc kubenswrapper[4954]: I1206 07:28:19.935693 4954 scope.go:117] "RemoveContainer" containerID="cadf0bf970efee2a4815c1a1b175fffd83284eaf0f45d7a45fe5892947c1264b" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.034099 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:31 crc kubenswrapper[4954]: E1206 07:28:31.035732 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="extract-utilities" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.035758 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="extract-utilities" Dec 06 07:28:31 crc kubenswrapper[4954]: E1206 07:28:31.035783 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="extract-content" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.035794 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="extract-content" Dec 06 07:28:31 crc kubenswrapper[4954]: E1206 07:28:31.035839 4954 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="registry-server" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.035850 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="registry-server" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.036159 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4f7704-da43-4626-b1b1-6fb8e157be37" containerName="registry-server" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.037949 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.050476 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.190358 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9jx\" (UniqueName: \"kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.190438 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.190605 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.292111 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9jx\" (UniqueName: \"kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.292180 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.292219 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.292887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.293094 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.315318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9jx\" (UniqueName: \"kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx\") pod \"certified-operators-6lrts\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.374116 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:31 crc kubenswrapper[4954]: I1206 07:28:31.873307 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:32 crc kubenswrapper[4954]: I1206 07:28:32.394983 4954 generic.go:334] "Generic (PLEG): container finished" podID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerID="a8f3add03d74235ef19e248d7d1bbdc757135f25b480d0e003e086e1be1e4f90" exitCode=0 Dec 06 07:28:32 crc kubenswrapper[4954]: I1206 07:28:32.395053 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerDied","Data":"a8f3add03d74235ef19e248d7d1bbdc757135f25b480d0e003e086e1be1e4f90"} Dec 06 07:28:32 crc kubenswrapper[4954]: I1206 07:28:32.395591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerStarted","Data":"3aa746ed1c5e097efbe4db132d3ce02b324d6e733ad5b8f525cec6094e6a938f"} Dec 06 07:28:32 crc kubenswrapper[4954]: I1206 07:28:32.399015 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:28:34 crc kubenswrapper[4954]: I1206 07:28:34.413403 4954 generic.go:334] "Generic (PLEG): container finished" podID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerID="af7d1031941b8695eed1c0c689e867858f3efd39262cc22e6edb86877f28e1d4" exitCode=0 Dec 06 07:28:34 crc kubenswrapper[4954]: I1206 07:28:34.414443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerDied","Data":"af7d1031941b8695eed1c0c689e867858f3efd39262cc22e6edb86877f28e1d4"} Dec 06 07:28:35 crc kubenswrapper[4954]: I1206 07:28:35.430231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerStarted","Data":"e89287191437630a444b31baba76b3d1c9244bf7a9a27b73e1b5a6c75e08491b"} Dec 06 07:28:35 crc kubenswrapper[4954]: I1206 07:28:35.463361 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6lrts" 
podStartSLOduration=2.939047853 podStartE2EDuration="5.463336076s" podCreationTimestamp="2025-12-06 07:28:30 +0000 UTC" firstStartedPulling="2025-12-06 07:28:32.398265171 +0000 UTC m=+1887.211624600" lastFinishedPulling="2025-12-06 07:28:34.922553434 +0000 UTC m=+1889.735912823" observedRunningTime="2025-12-06 07:28:35.457576782 +0000 UTC m=+1890.270936161" watchObservedRunningTime="2025-12-06 07:28:35.463336076 +0000 UTC m=+1890.276695455" Dec 06 07:28:41 crc kubenswrapper[4954]: I1206 07:28:41.375417 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:41 crc kubenswrapper[4954]: I1206 07:28:41.376469 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:41 crc kubenswrapper[4954]: I1206 07:28:41.438295 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:41 crc kubenswrapper[4954]: I1206 07:28:41.546484 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:41 crc kubenswrapper[4954]: I1206 07:28:41.687787 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:43 crc kubenswrapper[4954]: I1206 07:28:43.512473 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6lrts" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="registry-server" containerID="cri-o://e89287191437630a444b31baba76b3d1c9244bf7a9a27b73e1b5a6c75e08491b" gracePeriod=2 Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.528186 4954 generic.go:334] "Generic (PLEG): container finished" podID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerID="e89287191437630a444b31baba76b3d1c9244bf7a9a27b73e1b5a6c75e08491b" exitCode=0 Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.528731 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerDied","Data":"e89287191437630a444b31baba76b3d1c9244bf7a9a27b73e1b5a6c75e08491b"} Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.725373 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.836389 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht9jx\" (UniqueName: \"kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx\") pod \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.836514 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content\") pod \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.836602 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities\") pod \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\" (UID: \"5ef86144-7d9d-4d29-a7b7-6bd18c58600e\") " Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.837854 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities" (OuterVolumeSpecName: "utilities") pod "5ef86144-7d9d-4d29-a7b7-6bd18c58600e" (UID: "5ef86144-7d9d-4d29-a7b7-6bd18c58600e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.850757 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx" (OuterVolumeSpecName: "kube-api-access-ht9jx") pod "5ef86144-7d9d-4d29-a7b7-6bd18c58600e" (UID: "5ef86144-7d9d-4d29-a7b7-6bd18c58600e"). InnerVolumeSpecName "kube-api-access-ht9jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.907479 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ef86144-7d9d-4d29-a7b7-6bd18c58600e" (UID: "5ef86144-7d9d-4d29-a7b7-6bd18c58600e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.939241 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.939308 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht9jx\" (UniqueName: \"kubernetes.io/projected/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-kube-api-access-ht9jx\") on node \"crc\" DevicePath \"\"" Dec 06 07:28:44 crc kubenswrapper[4954]: I1206 07:28:44.939327 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef86144-7d9d-4d29-a7b7-6bd18c58600e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.543975 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6lrts" event={"ID":"5ef86144-7d9d-4d29-a7b7-6bd18c58600e","Type":"ContainerDied","Data":"3aa746ed1c5e097efbe4db132d3ce02b324d6e733ad5b8f525cec6094e6a938f"} Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.544119 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6lrts" Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.544414 4954 scope.go:117] "RemoveContainer" containerID="e89287191437630a444b31baba76b3d1c9244bf7a9a27b73e1b5a6c75e08491b" Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.584622 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.591018 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6lrts"] Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.591304 4954 scope.go:117] "RemoveContainer" containerID="af7d1031941b8695eed1c0c689e867858f3efd39262cc22e6edb86877f28e1d4" Dec 06 07:28:45 crc kubenswrapper[4954]: I1206 07:28:45.620825 4954 scope.go:117] "RemoveContainer" containerID="a8f3add03d74235ef19e248d7d1bbdc757135f25b480d0e003e086e1be1e4f90" Dec 06 07:28:47 crc kubenswrapper[4954]: I1206 07:28:47.452090 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" path="/var/lib/kubelet/pods/5ef86144-7d9d-4d29-a7b7-6bd18c58600e/volumes" Dec 06 07:29:10 crc kubenswrapper[4954]: I1206 07:29:10.101425 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:29:10 crc kubenswrapper[4954]: I1206 07:29:10.102796 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.048307 4954 scope.go:117] "RemoveContainer" containerID="ea6bce84684f9d9646ab01dc7bd2553bf2cde9261c87d0499c52ccf5074334f4" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.092882 4954 scope.go:117] "RemoveContainer" 
containerID="7165148211c75cd9f81f468e8ae42da7229964e346f5de7e874bb757593ba1f9" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.145457 4954 scope.go:117] "RemoveContainer" containerID="bbc38d953a02dab3938d603ccd3c213f1803459204492241ede64bcaeec6e808" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.176217 4954 scope.go:117] "RemoveContainer" containerID="85abce05f2197fc99761b6476531dcf92d1318db569f7b035a42686be23aca06" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.209631 4954 scope.go:117] "RemoveContainer" containerID="3d7db18b93f40985771a850dfbb60c46bb605a815e84de1215841e3e8caefc04" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.247796 4954 scope.go:117] "RemoveContainer" containerID="47d681a97ad53060d226bf1814a05e625af235113bd664a53739e0decbc48a7d" Dec 06 07:29:20 crc kubenswrapper[4954]: I1206 07:29:20.275001 4954 scope.go:117] "RemoveContainer" containerID="2bc60fc2d15e39cabc0790d5969c57496804901c2638939ec5960ab33712f898" Dec 06 07:29:40 crc kubenswrapper[4954]: I1206 07:29:40.101263 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:29:40 crc kubenswrapper[4954]: I1206 07:29:40.102160 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.168810 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv"] Dec 06 07:30:00 crc kubenswrapper[4954]: E1206 07:30:00.169803 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="registry-server" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.169818 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="registry-server" Dec 06 07:30:00 crc kubenswrapper[4954]: E1206 07:30:00.169829 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="extract-utilities" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.169836 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="extract-utilities" Dec 06 07:30:00 crc kubenswrapper[4954]: E1206 07:30:00.169864 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="extract-content" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.169871 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="extract-content" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.170076 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef86144-7d9d-4d29-a7b7-6bd18c58600e" containerName="registry-server" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.170681 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.174181 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.174553 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.187806 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv"] Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.304310 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.304485 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.304899 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.409391 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.409493 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.409635 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.411124 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume\") pod 
\"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.423601 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.446925 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88\") pod \"collect-profiles-29416770-zdsdv\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.491453 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:00 crc kubenswrapper[4954]: I1206 07:30:00.794213 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv"] Dec 06 07:30:01 crc kubenswrapper[4954]: I1206 07:30:01.580336 4954 generic.go:334] "Generic (PLEG): container finished" podID="f4735c87-b0cf-4ce4-8e19-506938b991de" containerID="91ebdc7bfa8196cde17ffe336f5470cc3a3e672f35b8ddd3b9434cc8105d4baa" exitCode=0 Dec 06 07:30:01 crc kubenswrapper[4954]: I1206 07:30:01.580411 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" event={"ID":"f4735c87-b0cf-4ce4-8e19-506938b991de","Type":"ContainerDied","Data":"91ebdc7bfa8196cde17ffe336f5470cc3a3e672f35b8ddd3b9434cc8105d4baa"} Dec 06 07:30:01 crc kubenswrapper[4954]: I1206 07:30:01.580878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" event={"ID":"f4735c87-b0cf-4ce4-8e19-506938b991de","Type":"ContainerStarted","Data":"00dc6522c8bd6889344600d2d50d67ebe044e14de6d498529617a9233e3b5bcd"} Dec 06 07:30:02 crc kubenswrapper[4954]: I1206 07:30:02.943631 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.062517 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume\") pod \"f4735c87-b0cf-4ce4-8e19-506938b991de\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.062730 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88\") pod \"f4735c87-b0cf-4ce4-8e19-506938b991de\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.063452 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume\") pod \"f4735c87-b0cf-4ce4-8e19-506938b991de\" (UID: \"f4735c87-b0cf-4ce4-8e19-506938b991de\") " Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.064600 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4735c87-b0cf-4ce4-8e19-506938b991de" (UID: "f4735c87-b0cf-4ce4-8e19-506938b991de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.074050 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88" (OuterVolumeSpecName: "kube-api-access-8xb88") pod "f4735c87-b0cf-4ce4-8e19-506938b991de" (UID: "f4735c87-b0cf-4ce4-8e19-506938b991de"). InnerVolumeSpecName "kube-api-access-8xb88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.074070 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4735c87-b0cf-4ce4-8e19-506938b991de" (UID: "f4735c87-b0cf-4ce4-8e19-506938b991de"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.168084 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/f4735c87-b0cf-4ce4-8e19-506938b991de-kube-api-access-8xb88\") on node \"crc\" DevicePath \"\"" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.168153 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4735c87-b0cf-4ce4-8e19-506938b991de-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.168172 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4735c87-b0cf-4ce4-8e19-506938b991de-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.603167 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" event={"ID":"f4735c87-b0cf-4ce4-8e19-506938b991de","Type":"ContainerDied","Data":"00dc6522c8bd6889344600d2d50d67ebe044e14de6d498529617a9233e3b5bcd"} Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.603259 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00dc6522c8bd6889344600d2d50d67ebe044e14de6d498529617a9233e3b5bcd" Dec 06 07:30:03 crc kubenswrapper[4954]: I1206 07:30:03.603318 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv" Dec 06 07:30:04 crc kubenswrapper[4954]: I1206 07:30:04.052955 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4"] Dec 06 07:30:04 crc kubenswrapper[4954]: I1206 07:30:04.063956 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416725-9mhn4"] Dec 06 07:30:05 crc kubenswrapper[4954]: I1206 07:30:05.468754 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c7e376-ab13-4dfa-900f-8de633105709" path="/var/lib/kubelet/pods/c2c7e376-ab13-4dfa-900f-8de633105709/volumes" Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.101474 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.103032 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.103157 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.104704 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775"} 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.104829 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775" gracePeriod=600 Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.697782 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775" exitCode=0 Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.697930 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775"} Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.698228 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"} Dec 06 07:30:10 crc kubenswrapper[4954]: I1206 07:30:10.698269 4954 scope.go:117] "RemoveContainer" containerID="2c0ca23564142cc2f4b3701c0b813be131c1b3cbf65dc02f3ef915a4c683a689" Dec 06 07:30:20 crc kubenswrapper[4954]: I1206 07:30:20.441396 4954 scope.go:117] "RemoveContainer" containerID="ef93735e42b3a0f6d59ae3c748e4dd01da95673b9109a66563800aa0c64b986c" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.913280 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"] Dec 06 07:31:20 crc kubenswrapper[4954]: E1206 07:31:20.915087 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4735c87-b0cf-4ce4-8e19-506938b991de" containerName="collect-profiles" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.915117 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4735c87-b0cf-4ce4-8e19-506938b991de" containerName="collect-profiles" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.915435 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4735c87-b0cf-4ce4-8e19-506938b991de" containerName="collect-profiles" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.917904 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.931121 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"] Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.964039 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.964114 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:20 crc kubenswrapper[4954]: I1206 07:31:20.964207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7pxq\" (UniqueName: \"kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.066288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7pxq\" (UniqueName: \"kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.066429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.066467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.067645 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.067782 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.113192 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7pxq\" (UniqueName: \"kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq\") pod \"redhat-operators-lxkcx\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") " pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.266033 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:21 crc kubenswrapper[4954]: I1206 07:31:21.558579 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"] Dec 06 07:31:22 crc kubenswrapper[4954]: I1206 07:31:22.504641 4954 generic.go:334] "Generic (PLEG): container finished" podID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerID="aefd1592473a45974b605b85fa3ef776f0134ebfd7239c71b27e8ad6a681752b" exitCode=0 Dec 06 07:31:22 crc kubenswrapper[4954]: I1206 07:31:22.504820 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerDied","Data":"aefd1592473a45974b605b85fa3ef776f0134ebfd7239c71b27e8ad6a681752b"} Dec 06 07:31:22 crc kubenswrapper[4954]: I1206 07:31:22.505036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerStarted","Data":"e943527dc012bef97a5226e2974ec2e53dc3ed173bab0cefad684491e80c99f5"} Dec 06 07:31:23 crc kubenswrapper[4954]: I1206 07:31:23.517141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerStarted","Data":"3c300e7ae82fd7c46f27ba504403cccade98bb4b9a6e557a1e532a6f5af52869"} Dec 06 07:31:24 crc kubenswrapper[4954]: I1206 07:31:24.533164 4954 generic.go:334] "Generic (PLEG): container finished" podID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerID="3c300e7ae82fd7c46f27ba504403cccade98bb4b9a6e557a1e532a6f5af52869" exitCode=0 Dec 06 07:31:24 crc kubenswrapper[4954]: I1206 07:31:24.533231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerDied","Data":"3c300e7ae82fd7c46f27ba504403cccade98bb4b9a6e557a1e532a6f5af52869"} Dec 06 07:31:25 crc kubenswrapper[4954]: I1206 07:31:25.549197 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerStarted","Data":"58b4b7910789bc388e7141f49d1994bddd47d02a754f95f812abeaa18a5d33c2"} Dec 06 07:31:25 crc kubenswrapper[4954]: I1206 07:31:25.586841 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lxkcx" podStartSLOduration=3.143277342 podStartE2EDuration="5.586810925s" podCreationTimestamp="2025-12-06 07:31:20 +0000 UTC" firstStartedPulling="2025-12-06 07:31:22.506608034 +0000 UTC m=+2057.319967423" lastFinishedPulling="2025-12-06 07:31:24.950141587 +0000 UTC m=+2059.763501006" observedRunningTime="2025-12-06 07:31:25.575039639 +0000 UTC m=+2060.388399108" watchObservedRunningTime="2025-12-06 07:31:25.586810925 +0000 UTC m=+2060.400170344" Dec 06 07:31:31 crc kubenswrapper[4954]: I1206 07:31:31.266670 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 
07:31:31 crc kubenswrapper[4954]: I1206 07:31:31.267647 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:32 crc kubenswrapper[4954]: I1206 07:31:32.350006 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lxkcx" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="registry-server" probeResult="failure" output=< Dec 06 07:31:32 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 07:31:32 crc kubenswrapper[4954]: > Dec 06 07:31:41 crc kubenswrapper[4954]: I1206 07:31:41.356279 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:41 crc kubenswrapper[4954]: I1206 07:31:41.427738 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lxkcx" Dec 06 07:31:41 crc kubenswrapper[4954]: I1206 07:31:41.617731 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"] Dec 06 07:31:42 crc kubenswrapper[4954]: I1206 07:31:42.729968 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lxkcx" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="registry-server" containerID="cri-o://58b4b7910789bc388e7141f49d1994bddd47d02a754f95f812abeaa18a5d33c2" gracePeriod=2 Dec 06 07:31:43 crc kubenswrapper[4954]: I1206 07:31:43.746502 4954 generic.go:334] "Generic (PLEG): container finished" podID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerID="58b4b7910789bc388e7141f49d1994bddd47d02a754f95f812abeaa18a5d33c2" exitCode=0 Dec 06 07:31:43 crc kubenswrapper[4954]: I1206 07:31:43.746639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerDied","Data":"58b4b7910789bc388e7141f49d1994bddd47d02a754f95f812abeaa18a5d33c2"} Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.320056 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxkcx"
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.400344 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7pxq\" (UniqueName: \"kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq\") pod \"ba01ec7d-4260-4e82-aa94-63c250786c9a\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") "
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.401129 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities\") pod \"ba01ec7d-4260-4e82-aa94-63c250786c9a\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") "
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.401345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content\") pod \"ba01ec7d-4260-4e82-aa94-63c250786c9a\" (UID: \"ba01ec7d-4260-4e82-aa94-63c250786c9a\") "
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.402211 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities" (OuterVolumeSpecName: "utilities") pod "ba01ec7d-4260-4e82-aa94-63c250786c9a" (UID: "ba01ec7d-4260-4e82-aa94-63c250786c9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.414885 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq" (OuterVolumeSpecName: "kube-api-access-t7pxq") pod "ba01ec7d-4260-4e82-aa94-63c250786c9a" (UID: "ba01ec7d-4260-4e82-aa94-63c250786c9a"). InnerVolumeSpecName "kube-api-access-t7pxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.503389 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7pxq\" (UniqueName: \"kubernetes.io/projected/ba01ec7d-4260-4e82-aa94-63c250786c9a-kube-api-access-t7pxq\") on node \"crc\" DevicePath \"\""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.503449 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.553780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba01ec7d-4260-4e82-aa94-63c250786c9a" (UID: "ba01ec7d-4260-4e82-aa94-63c250786c9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.605721 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba01ec7d-4260-4e82-aa94-63c250786c9a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.762995 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxkcx" event={"ID":"ba01ec7d-4260-4e82-aa94-63c250786c9a","Type":"ContainerDied","Data":"e943527dc012bef97a5226e2974ec2e53dc3ed173bab0cefad684491e80c99f5"}
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.763408 4954 scope.go:117] "RemoveContainer" containerID="58b4b7910789bc388e7141f49d1994bddd47d02a754f95f812abeaa18a5d33c2"
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.763253 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxkcx"
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.829056 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"]
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.841672 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lxkcx"]
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.849455 4954 scope.go:117] "RemoveContainer" containerID="3c300e7ae82fd7c46f27ba504403cccade98bb4b9a6e557a1e532a6f5af52869"
Dec 06 07:31:44 crc kubenswrapper[4954]: I1206 07:31:44.904119 4954 scope.go:117] "RemoveContainer" containerID="aefd1592473a45974b605b85fa3ef776f0134ebfd7239c71b27e8ad6a681752b"
Dec 06 07:31:45 crc kubenswrapper[4954]: I1206 07:31:45.464071 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" path="/var/lib/kubelet/pods/ba01ec7d-4260-4e82-aa94-63c250786c9a/volumes"
Dec 06 07:32:10 crc kubenswrapper[4954]: I1206 07:32:10.101123 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:32:10 crc kubenswrapper[4954]: I1206 07:32:10.102087 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:32:40 crc kubenswrapper[4954]: I1206 07:32:40.101934 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:32:40 crc kubenswrapper[4954]: I1206 07:32:40.102664 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.101647 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.102725 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.102800 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.103452 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.103523 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653" gracePeriod=600
Dec 06 07:33:10 crc kubenswrapper[4954]: E1206 07:33:10.245906 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.614063 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653" exitCode=0
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.614167 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"}
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.614682 4954 scope.go:117] "RemoveContainer" containerID="1b8fd69f59f015b0d17ba673b772374b46ab183512e18593f68562d659d40775"
Dec 06 07:33:10 crc kubenswrapper[4954]: I1206 07:33:10.615429 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:33:10 crc kubenswrapper[4954]: E1206 07:33:10.615719 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:33:23 crc kubenswrapper[4954]: I1206 07:33:23.443193 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:33:23 crc kubenswrapper[4954]: E1206 07:33:23.444315 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:33:34 crc kubenswrapper[4954]: I1206 07:33:34.443912 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:33:34 crc kubenswrapper[4954]: E1206 07:33:34.446353 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:33:48 crc kubenswrapper[4954]: I1206 07:33:48.443480 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:33:48 crc kubenswrapper[4954]: E1206 07:33:48.444828 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:02 crc kubenswrapper[4954]: I1206 07:34:02.444439 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:34:02 crc kubenswrapper[4954]: E1206 07:34:02.445696 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:13 crc kubenswrapper[4954]: I1206 07:34:13.446057 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:34:13 crc kubenswrapper[4954]: E1206 07:34:13.447721 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:26 crc kubenswrapper[4954]: I1206 07:34:26.444285 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:34:26 crc kubenswrapper[4954]: E1206 07:34:26.445900 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.089195 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:29 crc kubenswrapper[4954]: E1206 07:34:29.090213 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="extract-content"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.090238 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="extract-content"
Dec 06 07:34:29 crc kubenswrapper[4954]: E1206 07:34:29.090331 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="extract-utilities"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.090345 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="extract-utilities"
Dec 06 07:34:29 crc kubenswrapper[4954]: E1206 07:34:29.090373 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="registry-server"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.090387 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="registry-server"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.090724 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba01ec7d-4260-4e82-aa94-63c250786c9a" containerName="registry-server"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.092613 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.099507 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.276138 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.276202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5fv\" (UniqueName: \"kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.276818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.378441 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.378621 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.378704 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5fv\" (UniqueName: \"kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.379225 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.379543 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.403476 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5fv\" (UniqueName: \"kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv\") pod \"redhat-marketplace-xqp7h\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") " pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.423032 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:29 crc kubenswrapper[4954]: I1206 07:34:29.904026 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:30 crc kubenswrapper[4954]: I1206 07:34:30.455438 4954 generic.go:334] "Generic (PLEG): container finished" podID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerID="718012a4de54ff4b322bfbf938e63b2710111ed96eae4e54b618b25cddd3de57" exitCode=0
Dec 06 07:34:30 crc kubenswrapper[4954]: I1206 07:34:30.455526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerDied","Data":"718012a4de54ff4b322bfbf938e63b2710111ed96eae4e54b618b25cddd3de57"}
Dec 06 07:34:30 crc kubenswrapper[4954]: I1206 07:34:30.455586 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerStarted","Data":"85db97b42794b0b4afe60c32211594fcaf98f6ba6ae5640dd6e1220f127fc354"}
Dec 06 07:34:30 crc kubenswrapper[4954]: I1206 07:34:30.460059 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 07:34:31 crc kubenswrapper[4954]: I1206 07:34:31.470716 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerStarted","Data":"6577dcca473222792f243a10cdc01ee99c179919ce293def177b801fe3df4432"}
Dec 06 07:34:32 crc kubenswrapper[4954]: I1206 07:34:32.488858 4954 generic.go:334] "Generic (PLEG): container finished" podID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerID="6577dcca473222792f243a10cdc01ee99c179919ce293def177b801fe3df4432" exitCode=0
Dec 06 07:34:32 crc kubenswrapper[4954]: I1206 07:34:32.488973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerDied","Data":"6577dcca473222792f243a10cdc01ee99c179919ce293def177b801fe3df4432"}
Dec 06 07:34:33 crc kubenswrapper[4954]: I1206 07:34:33.504152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerStarted","Data":"dbfa7d57d97009c38ecd2c78780a8f3258e977c7c424f8fbd1e358f1a7ebe077"}
Dec 06 07:34:33 crc kubenswrapper[4954]: I1206 07:34:33.541860 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xqp7h" podStartSLOduration=2.105460785 podStartE2EDuration="4.54181966s" podCreationTimestamp="2025-12-06 07:34:29 +0000 UTC" firstStartedPulling="2025-12-06 07:34:30.459700505 +0000 UTC m=+2245.273059894" lastFinishedPulling="2025-12-06 07:34:32.89605938 +0000 UTC m=+2247.709418769" observedRunningTime="2025-12-06 07:34:33.530215782 +0000 UTC m=+2248.343575201" watchObservedRunningTime="2025-12-06 07:34:33.54181966 +0000 UTC m=+2248.355179079"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.444137 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:34:37 crc kubenswrapper[4954]: E1206 07:34:37.445052 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.468137 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rthwf"]
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.470810 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.481618 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rthwf"]
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.629391 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-utilities\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.629539 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24rg\" (UniqueName: \"kubernetes.io/projected/8fbbc4c6-c6da-42af-b612-d7febe934e94-kube-api-access-t24rg\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.629601 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-catalog-content\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.730836 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24rg\" (UniqueName: \"kubernetes.io/projected/8fbbc4c6-c6da-42af-b612-d7febe934e94-kube-api-access-t24rg\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.731355 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-catalog-content\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.731692 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-utilities\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.731993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-catalog-content\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.732370 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbbc4c6-c6da-42af-b612-d7febe934e94-utilities\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.763316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24rg\" (UniqueName: \"kubernetes.io/projected/8fbbc4c6-c6da-42af-b612-d7febe934e94-kube-api-access-t24rg\") pod \"community-operators-rthwf\" (UID: \"8fbbc4c6-c6da-42af-b612-d7febe934e94\") " pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:37 crc kubenswrapper[4954]: I1206 07:34:37.807799 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:38 crc kubenswrapper[4954]: I1206 07:34:38.162146 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rthwf"]
Dec 06 07:34:38 crc kubenswrapper[4954]: I1206 07:34:38.582274 4954 generic.go:334] "Generic (PLEG): container finished" podID="8fbbc4c6-c6da-42af-b612-d7febe934e94" containerID="c239d89d2c33e6011e426a7d3a0f1a403decd9b70a8817fc662fa4f406c99978" exitCode=0
Dec 06 07:34:38 crc kubenswrapper[4954]: I1206 07:34:38.582390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rthwf" event={"ID":"8fbbc4c6-c6da-42af-b612-d7febe934e94","Type":"ContainerDied","Data":"c239d89d2c33e6011e426a7d3a0f1a403decd9b70a8817fc662fa4f406c99978"}
Dec 06 07:34:38 crc kubenswrapper[4954]: I1206 07:34:38.582644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rthwf" event={"ID":"8fbbc4c6-c6da-42af-b612-d7febe934e94","Type":"ContainerStarted","Data":"4efbf27f8be09a4c8902bbd459b2b323f7f109ed10a7c706eb12d76a536b7f75"}
Dec 06 07:34:39 crc kubenswrapper[4954]: I1206 07:34:39.423699 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:39 crc kubenswrapper[4954]: I1206 07:34:39.425067 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:39 crc kubenswrapper[4954]: I1206 07:34:39.502693 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:39 crc kubenswrapper[4954]: I1206 07:34:39.675635 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:40 crc kubenswrapper[4954]: I1206 07:34:40.449987 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:41 crc kubenswrapper[4954]: I1206 07:34:41.607819 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xqp7h" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="registry-server" containerID="cri-o://dbfa7d57d97009c38ecd2c78780a8f3258e977c7c424f8fbd1e358f1a7ebe077" gracePeriod=2
Dec 06 07:34:42 crc kubenswrapper[4954]: I1206 07:34:42.630305 4954 generic.go:334] "Generic (PLEG): container finished" podID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerID="dbfa7d57d97009c38ecd2c78780a8f3258e977c7c424f8fbd1e358f1a7ebe077" exitCode=0
Dec 06 07:34:42 crc kubenswrapper[4954]: I1206 07:34:42.630417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerDied","Data":"dbfa7d57d97009c38ecd2c78780a8f3258e977c7c424f8fbd1e358f1a7ebe077"}
Dec 06 07:34:42 crc kubenswrapper[4954]: I1206 07:34:42.831652 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.033601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd5fv\" (UniqueName: \"kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv\") pod \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") "
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.033686 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content\") pod \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") "
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.033894 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities\") pod \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\" (UID: \"042daf06-8066-4ffe-a2a4-d2a7c7f74439\") "
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.035345 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities" (OuterVolumeSpecName: "utilities") pod "042daf06-8066-4ffe-a2a4-d2a7c7f74439" (UID: "042daf06-8066-4ffe-a2a4-d2a7c7f74439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.044742 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv" (OuterVolumeSpecName: "kube-api-access-gd5fv") pod "042daf06-8066-4ffe-a2a4-d2a7c7f74439" (UID: "042daf06-8066-4ffe-a2a4-d2a7c7f74439"). InnerVolumeSpecName "kube-api-access-gd5fv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.062980 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "042daf06-8066-4ffe-a2a4-d2a7c7f74439" (UID: "042daf06-8066-4ffe-a2a4-d2a7c7f74439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.136999 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.137047 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd5fv\" (UniqueName: \"kubernetes.io/projected/042daf06-8066-4ffe-a2a4-d2a7c7f74439-kube-api-access-gd5fv\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.137059 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/042daf06-8066-4ffe-a2a4-d2a7c7f74439-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.659137 4954 generic.go:334] "Generic (PLEG): container finished" podID="8fbbc4c6-c6da-42af-b612-d7febe934e94" containerID="ff8e961f71bd6bee6b63fee6fda5b54e4366c45c8bd0750bf53114307e01f8ef" exitCode=0
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.659488 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rthwf" event={"ID":"8fbbc4c6-c6da-42af-b612-d7febe934e94","Type":"ContainerDied","Data":"ff8e961f71bd6bee6b63fee6fda5b54e4366c45c8bd0750bf53114307e01f8ef"}
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.664799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqp7h" event={"ID":"042daf06-8066-4ffe-a2a4-d2a7c7f74439","Type":"ContainerDied","Data":"85db97b42794b0b4afe60c32211594fcaf98f6ba6ae5640dd6e1220f127fc354"}
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.664912 4954 scope.go:117] "RemoveContainer" containerID="dbfa7d57d97009c38ecd2c78780a8f3258e977c7c424f8fbd1e358f1a7ebe077"
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.664958 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqp7h"
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.703164 4954 scope.go:117] "RemoveContainer" containerID="6577dcca473222792f243a10cdc01ee99c179919ce293def177b801fe3df4432"
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.732852 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.741005 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqp7h"]
Dec 06 07:34:43 crc kubenswrapper[4954]: I1206 07:34:43.750969 4954 scope.go:117] "RemoveContainer" containerID="718012a4de54ff4b322bfbf938e63b2710111ed96eae4e54b618b25cddd3de57"
Dec 06 07:34:44 crc kubenswrapper[4954]: I1206 07:34:44.679990 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rthwf" event={"ID":"8fbbc4c6-c6da-42af-b612-d7febe934e94","Type":"ContainerStarted","Data":"ef5eadf06b1267aba971faf46bbbd12772c1c64bddcf12820223b03a09e80816"}
Dec 06 07:34:44 crc kubenswrapper[4954]: I1206 07:34:44.713967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rthwf" podStartSLOduration=1.998929228 podStartE2EDuration="7.713933642s" podCreationTimestamp="2025-12-06 07:34:37 +0000 UTC" firstStartedPulling="2025-12-06 07:34:38.584808747 +0000 UTC m=+2253.398168176" lastFinishedPulling="2025-12-06 07:34:44.299813161 +0000 UTC m=+2259.113172590" observedRunningTime="2025-12-06 07:34:44.702505488 +0000 UTC m=+2259.515864897" watchObservedRunningTime="2025-12-06 07:34:44.713933642 +0000 UTC m=+2259.527293071"
Dec 06 07:34:45 crc kubenswrapper[4954]: I1206 07:34:45.462398 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" path="/var/lib/kubelet/pods/042daf06-8066-4ffe-a2a4-d2a7c7f74439/volumes"
Dec 06 07:34:47 crc kubenswrapper[4954]: I1206 07:34:47.808913 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:47 crc kubenswrapper[4954]: I1206 07:34:47.809482 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:47 crc kubenswrapper[4954]: I1206 07:34:47.883165 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:51 crc kubenswrapper[4954]: I1206 07:34:51.444421 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:34:51 crc kubenswrapper[4954]: E1206 07:34:51.445696 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:34:57 crc kubenswrapper[4954]: I1206 07:34:57.887272 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rthwf"
Dec 06 07:34:57 crc kubenswrapper[4954]: I1206 07:34:57.993185 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rthwf"]
Dec 06 07:34:58 crc kubenswrapper[4954]: I1206 07:34:58.134331 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqpx7"]
Dec 06 07:34:58 crc kubenswrapper[4954]: I1206 07:34:58.134753 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xqpx7" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="registry-server" containerID="cri-o://0ccad4510dcb08566464f3fe79877a3df59cc000b01a9aab1dc3e009c2809632" gracePeriod=2
Dec 06 07:34:58 crc kubenswrapper[4954]: I1206 07:34:58.835460 4954 generic.go:334] "Generic (PLEG): container finished" podID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerID="0ccad4510dcb08566464f3fe79877a3df59cc000b01a9aab1dc3e009c2809632" exitCode=0
Dec 06 07:34:58 crc kubenswrapper[4954]: I1206 07:34:58.835604 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerDied","Data":"0ccad4510dcb08566464f3fe79877a3df59cc000b01a9aab1dc3e009c2809632"}
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.038352 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqpx7"
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.045526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7t5\" (UniqueName: \"kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5\") pod \"b9cc4661-bc27-418f-be19-fb8acd289e3f\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") "
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.045637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content\") pod \"b9cc4661-bc27-418f-be19-fb8acd289e3f\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") "
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.045696 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities\") pod \"b9cc4661-bc27-418f-be19-fb8acd289e3f\" (UID: \"b9cc4661-bc27-418f-be19-fb8acd289e3f\") "
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.047035 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities" (OuterVolumeSpecName: "utilities") pod "b9cc4661-bc27-418f-be19-fb8acd289e3f" (UID: "b9cc4661-bc27-418f-be19-fb8acd289e3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.051698 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5" (OuterVolumeSpecName: "kube-api-access-sc7t5") pod "b9cc4661-bc27-418f-be19-fb8acd289e3f" (UID: "b9cc4661-bc27-418f-be19-fb8acd289e3f"). InnerVolumeSpecName "kube-api-access-sc7t5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.111273 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9cc4661-bc27-418f-be19-fb8acd289e3f" (UID: "b9cc4661-bc27-418f-be19-fb8acd289e3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.147047 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7t5\" (UniqueName: \"kubernetes.io/projected/b9cc4661-bc27-418f-be19-fb8acd289e3f-kube-api-access-sc7t5\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.147113 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.147128 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cc4661-bc27-418f-be19-fb8acd289e3f-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.849391 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqpx7" event={"ID":"b9cc4661-bc27-418f-be19-fb8acd289e3f","Type":"ContainerDied","Data":"7d155bccbdfd72871e8df9e507c0ba22992ca03a8ea74b4759d253a325f02819"}
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.849444 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqpx7"
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.849503 4954 scope.go:117] "RemoveContainer" containerID="0ccad4510dcb08566464f3fe79877a3df59cc000b01a9aab1dc3e009c2809632"
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.874891 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqpx7"]
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.882914 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xqpx7"]
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.887177 4954 scope.go:117] "RemoveContainer" containerID="d10aa37641a42d454224670f9b09eeccb3b4c0a22e6ccb1c4f89c6da044883fe"
Dec 06 07:34:59 crc kubenswrapper[4954]: I1206 07:34:59.915005 4954 scope.go:117] "RemoveContainer" containerID="812ab61b745c1904453e4ebeee1f2374493033e2616e18659f7183bd285a6237"
Dec 06 07:35:01 crc kubenswrapper[4954]: I1206 07:35:01.457933 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" path="/var/lib/kubelet/pods/b9cc4661-bc27-418f-be19-fb8acd289e3f/volumes"
Dec 06 07:35:03 crc kubenswrapper[4954]: I1206 07:35:03.443918 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:35:03 crc kubenswrapper[4954]: E1206 07:35:03.444659 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:35:16 crc kubenswrapper[4954]: I1206 07:35:16.444844 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:35:16 crc kubenswrapper[4954]: E1206 07:35:16.445930 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:35:29 crc kubenswrapper[4954]: I1206 07:35:29.444081 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:35:29 crc kubenswrapper[4954]: E1206 07:35:29.445339 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:35:45 crc kubenswrapper[4954]: I1206 07:35:45.452885 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:35:45 crc kubenswrapper[4954]: E1206 07:35:45.454324 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:36:00 crc kubenswrapper[4954]: I1206 07:36:00.444013 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:36:00 crc kubenswrapper[4954]: E1206 07:36:00.445193 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:36:15 crc kubenswrapper[4954]: I1206 07:36:15.470273 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:36:15 crc kubenswrapper[4954]: E1206 07:36:15.472449 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:36:30 crc kubenswrapper[4954]: I1206 07:36:30.444461 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:36:30 crc kubenswrapper[4954]: E1206 07:36:30.445669 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:36:41 crc kubenswrapper[4954]: I1206 07:36:41.444556 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:36:41 crc kubenswrapper[4954]: E1206 07:36:41.447417 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:36:52 crc kubenswrapper[4954]: I1206 07:36:52.444096 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:36:52 crc kubenswrapper[4954]: E1206 07:36:52.444755 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:37:08 crc kubenswrapper[4954]: I1206 07:37:08.443887 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:37:08 crc kubenswrapper[4954]: E1206 07:37:08.445231 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:37:20 crc kubenswrapper[4954]: I1206 07:37:20.444556 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:37:20 crc kubenswrapper[4954]: E1206 07:37:20.446271 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:37:33 crc kubenswrapper[4954]: I1206 07:37:33.443902 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:37:33 crc kubenswrapper[4954]: E1206 07:37:33.445276 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:37:45 crc kubenswrapper[4954]: I1206 07:37:45.452471 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:37:45 crc kubenswrapper[4954]: E1206 07:37:45.453613 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:37:59 crc kubenswrapper[4954]: I1206 07:37:59.443321 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:37:59 crc kubenswrapper[4954]: E1206 07:37:59.444153 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:38:14 crc kubenswrapper[4954]: I1206 07:38:14.444316 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653"
Dec 06 07:38:15 crc kubenswrapper[4954]: I1206 07:38:15.857758 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f"}
Dec 06 07:40:40 crc kubenswrapper[4954]: I1206 07:40:40.101074 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:40:40 crc kubenswrapper[4954]: I1206 07:40:40.102311 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.803326 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"]
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804531 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="extract-content"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804547 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="extract-content"
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804587 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804594 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804612 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="extract-utilities"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804619 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="extract-utilities"
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804632 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804639 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804652 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="extract-content"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804659 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="extract-content"
Dec 06 07:40:54 crc kubenswrapper[4954]: E1206 07:40:54.804689 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="extract-utilities"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804696 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="extract-utilities"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804919 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cc4661-bc27-418f-be19-fb8acd289e3f" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.804944 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="042daf06-8066-4ffe-a2a4-d2a7c7f74439" containerName="registry-server"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.806396 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.822376 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"]
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.943277 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5mb\" (UniqueName: \"kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.943412 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:54 crc kubenswrapper[4954]: I1206 07:40:54.943691 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.045339 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.045422 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5mb\" (UniqueName: \"kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.045514 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.045988 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.046481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.075782 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5mb\" (UniqueName: \"kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb\") pod \"certified-operators-j7lsp\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") " pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.136404 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:40:55 crc kubenswrapper[4954]: I1206 07:40:55.685312 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"]
Dec 06 07:40:56 crc kubenswrapper[4954]: I1206 07:40:56.487012 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerStarted","Data":"52d2ae9bcc27b8068fab17bdebcc71d1d7bd34977a3a459c436586d77c046b14"}
Dec 06 07:40:57 crc kubenswrapper[4954]: I1206 07:40:57.502194 4954 generic.go:334] "Generic (PLEG): container finished" podID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerID="05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c" exitCode=0
Dec 06 07:40:57 crc kubenswrapper[4954]: I1206 07:40:57.502259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerDied","Data":"05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c"}
Dec 06 07:40:57 crc kubenswrapper[4954]: I1206 07:40:57.506654 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 07:41:02 crc kubenswrapper[4954]: I1206 07:41:02.549552 4954 generic.go:334] "Generic (PLEG): container finished" podID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerID="39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d" exitCode=0
Dec 06 07:41:02 crc kubenswrapper[4954]: I1206 07:41:02.549677 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerDied","Data":"39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d"}
Dec 06 07:41:10 crc kubenswrapper[4954]: I1206 07:41:10.102113 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 07:41:10 crc kubenswrapper[4954]: I1206 07:41:10.103202 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 07:41:15 crc kubenswrapper[4954]: I1206 07:41:15.701748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerStarted","Data":"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12"}
Dec 06 07:41:25 crc kubenswrapper[4954]: I1206 07:41:25.137109 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:41:25 crc kubenswrapper[4954]: I1206 07:41:25.137763 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:41:25 crc kubenswrapper[4954]: I1206 07:41:25.214604 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:41:25 crc kubenswrapper[4954]: I1206 07:41:25.247024 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j7lsp" podStartSLOduration=13.639012195 podStartE2EDuration="31.24698652s" podCreationTimestamp="2025-12-06 07:40:54 +0000 UTC" firstStartedPulling="2025-12-06 07:40:57.50626394 +0000 UTC m=+2632.319623339" lastFinishedPulling="2025-12-06 07:41:15.114238275 +0000 UTC m=+2649.927597664" observedRunningTime="2025-12-06 07:41:15.729115892 +0000 UTC m=+2650.542475301" watchObservedRunningTime="2025-12-06 07:41:25.24698652 +0000 UTC m=+2660.060345949"
Dec 06 07:41:25 crc kubenswrapper[4954]: I1206 07:41:25.829167 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:41:26 crc kubenswrapper[4954]: I1206 07:41:26.008992 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"]
Dec 06 07:41:27 crc kubenswrapper[4954]: I1206 07:41:27.793250 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j7lsp" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="registry-server" containerID="cri-o://ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12" gracePeriod=2
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.221175 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7lsp"
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.348795 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities\") pod \"66f27c1e-e639-42a2-9e1d-48afdd628249\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") "
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.349248 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n5mb\" (UniqueName: \"kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb\") pod \"66f27c1e-e639-42a2-9e1d-48afdd628249\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") "
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.349414 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content\") pod \"66f27c1e-e639-42a2-9e1d-48afdd628249\" (UID: \"66f27c1e-e639-42a2-9e1d-48afdd628249\") "
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.350122 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities" (OuterVolumeSpecName: "utilities") pod "66f27c1e-e639-42a2-9e1d-48afdd628249" (UID: "66f27c1e-e639-42a2-9e1d-48afdd628249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.360247 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb" (OuterVolumeSpecName: "kube-api-access-4n5mb") pod "66f27c1e-e639-42a2-9e1d-48afdd628249" (UID: "66f27c1e-e639-42a2-9e1d-48afdd628249"). InnerVolumeSpecName "kube-api-access-4n5mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.428356 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66f27c1e-e639-42a2-9e1d-48afdd628249" (UID: "66f27c1e-e639-42a2-9e1d-48afdd628249"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.450969 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.451024 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n5mb\" (UniqueName: \"kubernetes.io/projected/66f27c1e-e639-42a2-9e1d-48afdd628249-kube-api-access-4n5mb\") on node \"crc\" DevicePath \"\""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.451038 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f27c1e-e639-42a2-9e1d-48afdd628249-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.806667 4954 generic.go:334] "Generic (PLEG): container finished" podID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerID="ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12" exitCode=0
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.806721 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerDied","Data":"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12"}
Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.806742 4954 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-j7lsp" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.806758 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7lsp" event={"ID":"66f27c1e-e639-42a2-9e1d-48afdd628249","Type":"ContainerDied","Data":"52d2ae9bcc27b8068fab17bdebcc71d1d7bd34977a3a459c436586d77c046b14"} Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.806783 4954 scope.go:117] "RemoveContainer" containerID="ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.843920 4954 scope.go:117] "RemoveContainer" containerID="39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.850532 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"] Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.858230 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j7lsp"] Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.865778 4954 scope.go:117] "RemoveContainer" containerID="05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.891276 4954 scope.go:117] "RemoveContainer" containerID="ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12" Dec 06 07:41:28 crc kubenswrapper[4954]: E1206 07:41:28.892066 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12\": container with ID starting with ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12 not found: ID does not exist" containerID="ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.892102 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12"} err="failed to get container status \"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12\": rpc error: code = NotFound desc = could not find container \"ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12\": container with ID starting with ceada92c930bc1b526f78cae51e2ceec041620735dda237abcc514b5ca6c5f12 not found: ID does not exist" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.892130 4954 scope.go:117] "RemoveContainer" containerID="39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d" Dec 06 07:41:28 crc kubenswrapper[4954]: E1206 07:41:28.892718 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d\": container with ID starting with 39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d not found: ID does not exist" containerID="39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.892773 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d"} err="failed to get container status \"39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d\": rpc error: code = NotFound desc = could not find 
container \"39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d\": container with ID starting with 39aa6988a270584515870e19be8b4cfdd843ed97eda31bd385347cfb17f63f0d not found: ID does not exist" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.892813 4954 scope.go:117] "RemoveContainer" containerID="05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c" Dec 06 07:41:28 crc kubenswrapper[4954]: E1206 07:41:28.893323 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c\": container with ID starting with 05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c not found: ID does not exist" containerID="05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c" Dec 06 07:41:28 crc kubenswrapper[4954]: I1206 07:41:28.893352 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c"} err="failed to get container status \"05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c\": rpc error: code = NotFound desc = could not find container \"05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c\": container with ID starting with 05acddbfb4ac35b69995968532c403be1b7ba841b8d5c9bd69f0eb8c260f883c not found: ID does not exist" Dec 06 07:41:29 crc kubenswrapper[4954]: I1206 07:41:29.457789 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" path="/var/lib/kubelet/pods/66f27c1e-e639-42a2-9e1d-48afdd628249/volumes" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.101689 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.102863 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.102974 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.104321 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.104439 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f" gracePeriod=600 Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.310176 4954 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:41:40 crc kubenswrapper[4954]: E1206 07:41:40.311760 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="registry-server" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.311797 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="registry-server" Dec 06 07:41:40 crc kubenswrapper[4954]: E1206 07:41:40.311822 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="extract-content" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.311837 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="extract-content" Dec 06 07:41:40 crc kubenswrapper[4954]: E1206 07:41:40.311895 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="extract-utilities" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.311907 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="extract-utilities" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.312152 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f27c1e-e639-42a2-9e1d-48afdd628249" containerName="registry-server" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.315993 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.335973 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.366999 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.367117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.367154 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b5sm\" (UniqueName: \"kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.468785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.468893 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.468925 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b5sm\" (UniqueName: \"kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.469609 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.469759 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.490018 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b5sm\" (UniqueName: \"kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm\") pod \"redhat-operators-qbqf2\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.647342 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.917553 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:41:40 crc kubenswrapper[4954]: I1206 07:41:40.949414 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerStarted","Data":"3b940df4477b154f50ed238ed7a0728211c89366ce101091f5b434ba40f5f453"} Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.958617 4954 generic.go:334] "Generic (PLEG): container finished" podID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerID="fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0" exitCode=0 Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.958689 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerDied","Data":"fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0"} Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.963676 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f" exitCode=0 Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.963737 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f"} Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.963781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c"} Dec 06 07:41:41 crc kubenswrapper[4954]: I1206 07:41:41.963832 4954 scope.go:117] "RemoveContainer" containerID="2ea335e51989ae57ca4e17099e76162ba18d55986f27a17233f971e9c3c82653" Dec 06 07:41:44 crc kubenswrapper[4954]: I1206 07:41:44.002111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerStarted","Data":"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783"} Dec 06 07:41:45 crc kubenswrapper[4954]: I1206 07:41:45.018041 4954 generic.go:334] "Generic (PLEG): container finished" podID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerID="3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783" exitCode=0 Dec 06 07:41:45 crc kubenswrapper[4954]: I1206 07:41:45.018148 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerDied","Data":"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783"} Dec 06 07:41:46 crc kubenswrapper[4954]: I1206 07:41:46.032458 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerStarted","Data":"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4"} Dec 06 07:41:46 crc kubenswrapper[4954]: I1206 07:41:46.054858 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbqf2" podStartSLOduration=2.216603685 podStartE2EDuration="6.05482331s" podCreationTimestamp="2025-12-06 07:41:40 +0000 UTC" firstStartedPulling="2025-12-06 07:41:41.962696211 +0000 UTC m=+2676.776055600" lastFinishedPulling="2025-12-06 07:41:45.800915836 +0000 UTC m=+2680.614275225" observedRunningTime="2025-12-06 07:41:46.052998182 +0000 UTC m=+2680.866357581" watchObservedRunningTime="2025-12-06 07:41:46.05482331 +0000 UTC m=+2680.868182709" Dec 06 07:41:50 crc kubenswrapper[4954]: I1206 07:41:50.648277 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:50 crc kubenswrapper[4954]: I1206 07:41:50.649381 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:41:51 crc kubenswrapper[4954]: I1206 07:41:51.730435 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qbqf2" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="registry-server" probeResult="failure" output=< Dec 06 07:41:51 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 07:41:51 crc kubenswrapper[4954]: > Dec 06 07:42:00 crc kubenswrapper[4954]: I1206 07:42:00.704944 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:42:00 crc kubenswrapper[4954]: I1206 07:42:00.762370 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:42:00 crc kubenswrapper[4954]: I1206 07:42:00.947115 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.202836 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbqf2" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="registry-server" containerID="cri-o://f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4" gracePeriod=2 Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.738762 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.783153 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b5sm\" (UniqueName: \"kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm\") pod \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.783249 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities\") pod \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.783274 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content\") pod \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\" (UID: \"4d0515e1-e0d3-49fd-ac25-7ae706d752ca\") " Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.784623 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities" (OuterVolumeSpecName: "utilities") pod "4d0515e1-e0d3-49fd-ac25-7ae706d752ca" (UID: "4d0515e1-e0d3-49fd-ac25-7ae706d752ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.795051 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm" (OuterVolumeSpecName: "kube-api-access-7b5sm") pod "4d0515e1-e0d3-49fd-ac25-7ae706d752ca" (UID: "4d0515e1-e0d3-49fd-ac25-7ae706d752ca"). InnerVolumeSpecName "kube-api-access-7b5sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.885244 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b5sm\" (UniqueName: \"kubernetes.io/projected/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-kube-api-access-7b5sm\") on node \"crc\" DevicePath \"\"" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.885290 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.914108 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d0515e1-e0d3-49fd-ac25-7ae706d752ca" (UID: "4d0515e1-e0d3-49fd-ac25-7ae706d752ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:42:02 crc kubenswrapper[4954]: I1206 07:42:02.986806 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0515e1-e0d3-49fd-ac25-7ae706d752ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.218087 4954 generic.go:334] "Generic (PLEG): container finished" podID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerID="f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4" exitCode=0 Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.218408 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerDied","Data":"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4"} Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.218661 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbqf2" event={"ID":"4d0515e1-e0d3-49fd-ac25-7ae706d752ca","Type":"ContainerDied","Data":"3b940df4477b154f50ed238ed7a0728211c89366ce101091f5b434ba40f5f453"} Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.218693 4954 scope.go:117] "RemoveContainer" containerID="f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.218524 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbqf2" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.262870 4954 scope.go:117] "RemoveContainer" containerID="3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.270861 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.286709 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbqf2"] Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.295238 4954 scope.go:117] "RemoveContainer" containerID="fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.332384 4954 scope.go:117] "RemoveContainer" containerID="f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4" Dec 06 07:42:03 crc kubenswrapper[4954]: E1206 07:42:03.332798 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4\": container with ID starting with f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4 not found: ID does not exist" containerID="f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.332835 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4"} err="failed to get container status \"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4\": rpc error: code = NotFound desc = could not find container \"f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4\": container with ID starting with f741472dcc38f96a2ef396cef8df4ea366f6ac5d3c8bd16470b3eaf707d65ac4 not found: ID does not exist" Dec 06 07:42:03 crc 
kubenswrapper[4954]: I1206 07:42:03.332865 4954 scope.go:117] "RemoveContainer" containerID="3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783" Dec 06 07:42:03 crc kubenswrapper[4954]: E1206 07:42:03.333111 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783\": container with ID starting with 3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783 not found: ID does not exist" containerID="3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.333253 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783"} err="failed to get container status \"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783\": rpc error: code = NotFound desc = could not find container \"3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783\": container with ID starting with 3dfe0889e993bb600c36a45e059deeca6da020d42fa87b7aa8d0b2953801e783 not found: ID does not exist" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.333377 4954 scope.go:117] "RemoveContainer" containerID="fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0" Dec 06 07:42:03 crc kubenswrapper[4954]: E1206 07:42:03.333768 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0\": container with ID starting with fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0 not found: ID does not exist" containerID="fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.333794 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0"} err="failed to get container status \"fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0\": rpc error: code = NotFound desc = could not find container \"fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0\": container with ID starting with fc488cf1fe1491d034d33609d7f351a1e76324bf687afa4d042046b4e7c71bd0 not found: ID does not exist" Dec 06 07:42:03 crc kubenswrapper[4954]: I1206 07:42:03.465061 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" path="/var/lib/kubelet/pods/4d0515e1-e0d3-49fd-ac25-7ae706d752ca/volumes" Dec 06 07:44:10 crc kubenswrapper[4954]: I1206 07:44:10.101268 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:44:10 crc kubenswrapper[4954]: I1206 07:44:10.101976 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:44:40 crc kubenswrapper[4954]: I1206 07:44:40.101482 4954 patch_prober.go:28] interesting 
pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:44:40 crc kubenswrapper[4954]: I1206 07:44:40.102198 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.877329 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:44:51 crc kubenswrapper[4954]: E1206 07:44:51.878553 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="extract-utilities" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.878599 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="extract-utilities" Dec 06 07:44:51 crc kubenswrapper[4954]: E1206 07:44:51.878632 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="registry-server" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.878662 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="registry-server" Dec 06 07:44:51 crc kubenswrapper[4954]: E1206 07:44:51.878675 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="extract-content" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.878683 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="extract-content" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.879054 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0515e1-e0d3-49fd-ac25-7ae706d752ca" containerName="registry-server" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.890070 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:51 crc kubenswrapper[4954]: I1206 07:44:51.898252 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.070166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.070552 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.070639 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnkl\" (UniqueName: \"kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.172700 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnkl\" (UniqueName: \"kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.172822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.172848 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.173447 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.173553 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.201016 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ffnkl\" (UniqueName: \"kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl\") pod \"community-operators-tg6pd\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.209311 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.503411 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.875686 4954 generic.go:334] "Generic (PLEG): container finished" podID="34119f10-b92f-4557-9f76-467d886dce3c" containerID="d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28" exitCode=0 Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.875742 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerDied","Data":"d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28"} Dec 06 07:44:52 crc kubenswrapper[4954]: I1206 07:44:52.875781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerStarted","Data":"8c676a11fa8b691aa66627b6204f6b2d4fd54793a34b6465b242402574d4d870"} Dec 06 07:44:54 crc kubenswrapper[4954]: I1206 07:44:54.899428 4954 generic.go:334] "Generic (PLEG): container finished" podID="34119f10-b92f-4557-9f76-467d886dce3c" containerID="d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79" exitCode=0 Dec 06 07:44:54 crc kubenswrapper[4954]: I1206 07:44:54.899535 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerDied","Data":"d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79"} Dec 06 07:44:55 crc kubenswrapper[4954]: I1206 07:44:55.914058 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerStarted","Data":"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75"} Dec 06 07:44:55 crc kubenswrapper[4954]: I1206 07:44:55.942070 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tg6pd" podStartSLOduration=2.527301006 podStartE2EDuration="4.942042983s" podCreationTimestamp="2025-12-06 07:44:51 +0000 UTC" firstStartedPulling="2025-12-06 07:44:52.877099982 +0000 UTC m=+2867.690459381" lastFinishedPulling="2025-12-06 07:44:55.291841949 +0000 UTC m=+2870.105201358" observedRunningTime="2025-12-06 07:44:55.938260492 +0000 UTC m=+2870.751619911" watchObservedRunningTime="2025-12-06 07:44:55.942042983 +0000 UTC m=+2870.755402372" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.156171 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l"] Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.158195 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.166004 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.168132 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.179205 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l"] Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.210147 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4785\" (UniqueName: \"kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.210229 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.210320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.312184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.312338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4785\" (UniqueName: \"kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.312383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.314249 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume\") pod 
\"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.324408 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.338547 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4785\" (UniqueName: \"kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785\") pod \"collect-profiles-29416785-n888l\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.483349 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:00 crc kubenswrapper[4954]: I1206 07:45:00.958580 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l"] Dec 06 07:45:01 crc kubenswrapper[4954]: I1206 07:45:01.968878 4954 generic.go:334] "Generic (PLEG): container finished" podID="5edc0052-160b-4212-9556-7af69bb55a89" containerID="4f335a34f8d198dec55906916f687644d1185db68ad24b6479190a0673b93a79" exitCode=0 Dec 06 07:45:01 crc kubenswrapper[4954]: I1206 07:45:01.968948 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" event={"ID":"5edc0052-160b-4212-9556-7af69bb55a89","Type":"ContainerDied","Data":"4f335a34f8d198dec55906916f687644d1185db68ad24b6479190a0673b93a79"} Dec 06 07:45:01 crc kubenswrapper[4954]: I1206 07:45:01.969503 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" event={"ID":"5edc0052-160b-4212-9556-7af69bb55a89","Type":"ContainerStarted","Data":"6b2047dcfce7131e9e6745c37ba5b26546a17b940eb42c947eb9d64c2f2698d8"} Dec 06 07:45:02 crc kubenswrapper[4954]: I1206 07:45:02.210040 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:02 crc kubenswrapper[4954]: I1206 07:45:02.210241 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:02 crc kubenswrapper[4954]: I1206 07:45:02.300296 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.057718 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.115521 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.257659 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.377267 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume\") pod \"5edc0052-160b-4212-9556-7af69bb55a89\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.377416 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4785\" (UniqueName: \"kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785\") pod \"5edc0052-160b-4212-9556-7af69bb55a89\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.377559 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume\") pod \"5edc0052-160b-4212-9556-7af69bb55a89\" (UID: \"5edc0052-160b-4212-9556-7af69bb55a89\") " Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.378848 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume" (OuterVolumeSpecName: "config-volume") pod "5edc0052-160b-4212-9556-7af69bb55a89" (UID: "5edc0052-160b-4212-9556-7af69bb55a89"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.384092 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5edc0052-160b-4212-9556-7af69bb55a89" (UID: "5edc0052-160b-4212-9556-7af69bb55a89"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.384113 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785" (OuterVolumeSpecName: "kube-api-access-f4785") pod "5edc0052-160b-4212-9556-7af69bb55a89" (UID: "5edc0052-160b-4212-9556-7af69bb55a89"). InnerVolumeSpecName "kube-api-access-f4785". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.480010 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edc0052-160b-4212-9556-7af69bb55a89-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.480084 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4785\" (UniqueName: \"kubernetes.io/projected/5edc0052-160b-4212-9556-7af69bb55a89-kube-api-access-f4785\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.480101 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5edc0052-160b-4212-9556-7af69bb55a89-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.989906 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" event={"ID":"5edc0052-160b-4212-9556-7af69bb55a89","Type":"ContainerDied","Data":"6b2047dcfce7131e9e6745c37ba5b26546a17b940eb42c947eb9d64c2f2698d8"} Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.989968 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2047dcfce7131e9e6745c37ba5b26546a17b940eb42c947eb9d64c2f2698d8" Dec 06 07:45:03 crc kubenswrapper[4954]: I1206 07:45:03.989934 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l" Dec 06 07:45:04 crc kubenswrapper[4954]: I1206 07:45:04.357144 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm"] Dec 06 07:45:04 crc kubenswrapper[4954]: I1206 07:45:04.362580 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416740-z7cjm"] Dec 06 07:45:05 crc kubenswrapper[4954]: I1206 07:45:04.999810 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tg6pd" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="registry-server" containerID="cri-o://2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75" gracePeriod=2 Dec 06 07:45:05 crc kubenswrapper[4954]: I1206 07:45:05.466223 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a09e25-664b-42af-9587-8ce037ab7b82" path="/var/lib/kubelet/pods/75a09e25-664b-42af-9587-8ce037ab7b82/volumes" Dec 06 07:45:05 crc kubenswrapper[4954]: I1206 07:45:05.949201 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.011822 4954 generic.go:334] "Generic (PLEG): container finished" podID="34119f10-b92f-4557-9f76-467d886dce3c" containerID="2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75" exitCode=0 Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.011885 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerDied","Data":"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75"} Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.012329 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tg6pd" event={"ID":"34119f10-b92f-4557-9f76-467d886dce3c","Type":"ContainerDied","Data":"8c676a11fa8b691aa66627b6204f6b2d4fd54793a34b6465b242402574d4d870"} Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.012358 4954 scope.go:117] "RemoveContainer" containerID="2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.012000 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tg6pd" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.019446 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content\") pod \"34119f10-b92f-4557-9f76-467d886dce3c\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.019502 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffnkl\" (UniqueName: \"kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl\") pod \"34119f10-b92f-4557-9f76-467d886dce3c\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.019728 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities\") pod \"34119f10-b92f-4557-9f76-467d886dce3c\" (UID: \"34119f10-b92f-4557-9f76-467d886dce3c\") " Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.020662 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities" (OuterVolumeSpecName: "utilities") pod "34119f10-b92f-4557-9f76-467d886dce3c" (UID: "34119f10-b92f-4557-9f76-467d886dce3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.024781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl" (OuterVolumeSpecName: "kube-api-access-ffnkl") pod "34119f10-b92f-4557-9f76-467d886dce3c" (UID: "34119f10-b92f-4557-9f76-467d886dce3c"). InnerVolumeSpecName "kube-api-access-ffnkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.064738 4954 scope.go:117] "RemoveContainer" containerID="d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.077844 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34119f10-b92f-4557-9f76-467d886dce3c" (UID: "34119f10-b92f-4557-9f76-467d886dce3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.096545 4954 scope.go:117] "RemoveContainer" containerID="d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.122197 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.122249 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34119f10-b92f-4557-9f76-467d886dce3c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.122268 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffnkl\" (UniqueName: \"kubernetes.io/projected/34119f10-b92f-4557-9f76-467d886dce3c-kube-api-access-ffnkl\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.129508 4954 scope.go:117] "RemoveContainer" containerID="2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75" Dec 06 07:45:06 crc kubenswrapper[4954]: E1206 07:45:06.130222 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75\": container with ID starting with 2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75 not found: ID does not exist" containerID="2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.130294 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75"} err="failed to get container status \"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75\": rpc error: code = NotFound desc = could not find container \"2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75\": container with ID starting with 2970903bd22dca3141460da9af4abd352c0856a0457a12099d5f7f2831ad7c75 not found: ID does not exist" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.130334 4954 scope.go:117] "RemoveContainer" containerID="d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79" Dec 06 07:45:06 crc kubenswrapper[4954]: E1206 07:45:06.130838 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79\": container with ID starting with d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79 not found: ID does not exist" containerID="d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79" Dec 
06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.130880 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79"} err="failed to get container status \"d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79\": rpc error: code = NotFound desc = could not find container \"d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79\": container with ID starting with d0467977c1c92d36b96b75c0286edc06fafc52ec67d63d671eb3540d7a274d79 not found: ID does not exist" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.130917 4954 scope.go:117] "RemoveContainer" containerID="d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28" Dec 06 07:45:06 crc kubenswrapper[4954]: E1206 07:45:06.131300 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28\": container with ID starting with d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28 not found: ID does not exist" containerID="d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.131353 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28"} err="failed to get container status \"d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28\": rpc error: code = NotFound desc = could not find container \"d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28\": container with ID starting with d9b2a4e23f60e0c1f1c8338d5e9f0a9186d55a81f04914a1cf4450d2b586ef28 not found: ID does not exist" Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.366172 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:45:06 crc kubenswrapper[4954]: I1206 07:45:06.374212 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tg6pd"] Dec 06 07:45:07 crc kubenswrapper[4954]: I1206 07:45:07.454876 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34119f10-b92f-4557-9f76-467d886dce3c" path="/var/lib/kubelet/pods/34119f10-b92f-4557-9f76-467d886dce3c/volumes" Dec 06 07:45:10 crc kubenswrapper[4954]: I1206 07:45:10.101240 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:45:10 crc kubenswrapper[4954]: I1206 07:45:10.101827 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:45:10 crc kubenswrapper[4954]: I1206 07:45:10.101900 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:45:10 crc kubenswrapper[4954]: I1206 07:45:10.102566 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:45:10 crc kubenswrapper[4954]: I1206 07:45:10.102735 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" gracePeriod=600 Dec 06 07:45:10 crc kubenswrapper[4954]: E1206 07:45:10.228516 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:45:11 crc kubenswrapper[4954]: I1206 07:45:11.057743 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" exitCode=0 Dec 06 07:45:11 crc kubenswrapper[4954]: I1206 07:45:11.057803 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c"} Dec 06 07:45:11 crc kubenswrapper[4954]: I1206 07:45:11.057885 4954 scope.go:117] "RemoveContainer" containerID="4dc2e3035bfcd8bdb88c9fd44bbcb2341c64959383929c667755e4552731577f" Dec 06 07:45:11 crc kubenswrapper[4954]: I1206 07:45:11.058999 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:45:11 crc kubenswrapper[4954]: E1206 07:45:11.059322 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:45:20 crc kubenswrapper[4954]: I1206 07:45:20.887697 4954 scope.go:117] "RemoveContainer" containerID="29592f7784d620334b63ad3b81dfa27d546fe39ea89acaa466a443c3ffdcf938" Dec 06 07:45:23 crc kubenswrapper[4954]: I1206 07:45:23.444287 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:45:23 crc kubenswrapper[4954]: E1206 07:45:23.444614 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.345598 4954 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:31 crc kubenswrapper[4954]: E1206 07:45:31.347257 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="extract-utilities" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347288 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="extract-utilities" Dec 06 07:45:31 crc kubenswrapper[4954]: E1206 07:45:31.347311 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="extract-content" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347328 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="extract-content" Dec 06 07:45:31 crc kubenswrapper[4954]: E1206 07:45:31.347358 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="registry-server" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347376 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="registry-server" Dec 06 07:45:31 crc kubenswrapper[4954]: E1206 07:45:31.347453 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5edc0052-160b-4212-9556-7af69bb55a89" containerName="collect-profiles" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347470 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edc0052-160b-4212-9556-7af69bb55a89" containerName="collect-profiles" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347934 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edc0052-160b-4212-9556-7af69bb55a89" containerName="collect-profiles" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.347977 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="34119f10-b92f-4557-9f76-467d886dce3c" containerName="registry-server" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.350498 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.368305 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.455074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.455621 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz77j\" (UniqueName: \"kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.455901 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.557357 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.557505 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.557584 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz77j\" (UniqueName: \"kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.557988 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.558762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.600615 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lz77j\" (UniqueName: \"kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j\") pod \"redhat-marketplace-th5ws\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:31 crc kubenswrapper[4954]: I1206 07:45:31.685966 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:32 crc kubenswrapper[4954]: I1206 07:45:32.229874 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:32 crc kubenswrapper[4954]: I1206 07:45:32.251829 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerStarted","Data":"aad28422460be8f7330c5333583d7467e9983e45b0913f731445b0a2243a13ab"} Dec 06 07:45:33 crc kubenswrapper[4954]: I1206 07:45:33.261825 4954 generic.go:334] "Generic (PLEG): container finished" podID="16a7e520-dadf-46bc-94c1-0755f0640698" containerID="00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6" exitCode=0 Dec 06 07:45:33 crc kubenswrapper[4954]: I1206 07:45:33.261880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerDied","Data":"00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6"} Dec 06 07:45:34 crc kubenswrapper[4954]: I1206 07:45:34.273421 4954 generic.go:334] "Generic (PLEG): container finished" podID="16a7e520-dadf-46bc-94c1-0755f0640698" containerID="cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546" exitCode=0 Dec 06 07:45:34 crc kubenswrapper[4954]: I1206 07:45:34.273514 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerDied","Data":"cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546"} Dec 06 07:45:36 crc kubenswrapper[4954]: I1206 07:45:36.308886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerStarted","Data":"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088"} Dec 06 07:45:36 crc kubenswrapper[4954]: I1206 07:45:36.340472 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-th5ws" podStartSLOduration=3.059990214 podStartE2EDuration="5.340455136s" podCreationTimestamp="2025-12-06 07:45:31 +0000 UTC" firstStartedPulling="2025-12-06 07:45:33.265924719 +0000 UTC m=+2908.079284108" lastFinishedPulling="2025-12-06 07:45:35.546389641 +0000 UTC m=+2910.359749030" observedRunningTime="2025-12-06 07:45:36.337286741 +0000 UTC m=+2911.150646130" watchObservedRunningTime="2025-12-06 07:45:36.340455136 +0000 UTC m=+2911.153814525" Dec 06 07:45:37 crc kubenswrapper[4954]: I1206 07:45:37.443974 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:45:37 crc kubenswrapper[4954]: E1206 07:45:37.444755 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:45:41 crc kubenswrapper[4954]: I1206 07:45:41.686501 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:41 crc kubenswrapper[4954]: I1206 07:45:41.687555 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:41 crc kubenswrapper[4954]: I1206 07:45:41.745813 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:42 crc kubenswrapper[4954]: I1206 07:45:42.416837 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:42 crc kubenswrapper[4954]: I1206 07:45:42.472556 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:44 crc kubenswrapper[4954]: I1206 07:45:44.385810 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-th5ws" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="registry-server" containerID="cri-o://d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088" gracePeriod=2 Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.334818 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.399401 4954 generic.go:334] "Generic (PLEG): container finished" podID="16a7e520-dadf-46bc-94c1-0755f0640698" containerID="d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088" exitCode=0 Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.399478 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerDied","Data":"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088"} Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.399509 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-th5ws" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.399540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-th5ws" event={"ID":"16a7e520-dadf-46bc-94c1-0755f0640698","Type":"ContainerDied","Data":"aad28422460be8f7330c5333583d7467e9983e45b0913f731445b0a2243a13ab"} Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.399620 4954 scope.go:117] "RemoveContainer" containerID="d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.406537 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content\") pod \"16a7e520-dadf-46bc-94c1-0755f0640698\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.406748 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities\") pod \"16a7e520-dadf-46bc-94c1-0755f0640698\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.407075 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz77j\" (UniqueName: \"kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j\") pod \"16a7e520-dadf-46bc-94c1-0755f0640698\" (UID: \"16a7e520-dadf-46bc-94c1-0755f0640698\") " Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.407719 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities" (OuterVolumeSpecName: "utilities") pod "16a7e520-dadf-46bc-94c1-0755f0640698" (UID: "16a7e520-dadf-46bc-94c1-0755f0640698"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.408373 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.418080 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j" (OuterVolumeSpecName: "kube-api-access-lz77j") pod "16a7e520-dadf-46bc-94c1-0755f0640698" (UID: "16a7e520-dadf-46bc-94c1-0755f0640698"). InnerVolumeSpecName "kube-api-access-lz77j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.430282 4954 scope.go:117] "RemoveContainer" containerID="cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.438962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16a7e520-dadf-46bc-94c1-0755f0640698" (UID: "16a7e520-dadf-46bc-94c1-0755f0640698"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.458505 4954 scope.go:117] "RemoveContainer" containerID="00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.482824 4954 scope.go:117] "RemoveContainer" containerID="d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088" Dec 06 07:45:45 crc kubenswrapper[4954]: E1206 07:45:45.483511 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088\": container with ID starting with d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088 not found: ID does not exist" containerID="d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.483552 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088"} err="failed to get container status \"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088\": rpc error: code = NotFound desc = could not find container \"d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088\": container with ID starting with d56c5dacde6f88fdfec6d5240fd60afbc0e31d8372094dec622163a2e597e088 not found: ID does not exist" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.483610 4954 scope.go:117] "RemoveContainer" containerID="cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546" Dec 06 07:45:45 crc kubenswrapper[4954]: E1206 07:45:45.483975 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546\": container with ID starting with cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546 not found: ID does not exist" containerID="cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.484008 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546"} err="failed to get container status \"cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546\": rpc error: code = NotFound desc = could not find container \"cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546\": container with ID starting with cb4489a85fd3aff266f9a53af1b8e0c814183fe246b44dd0381c628e87098546 not found: ID does not exist" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.484028 4954 scope.go:117] "RemoveContainer" containerID="00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6" Dec 06 07:45:45 crc kubenswrapper[4954]: E1206 07:45:45.484997 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6\": container with ID starting with 00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6 not found: ID does not exist" containerID="00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.485056 4954 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6"} err="failed to get container status \"00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6\": rpc error: code = NotFound desc = could not find container \"00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6\": container with ID starting with 00abf22a0a62d193bf63610159bc3646c689b845d4ec96a3cbe2aee2393fc7a6 not found: ID does not exist" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.512163 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a7e520-dadf-46bc-94c1-0755f0640698-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.512224 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz77j\" (UniqueName: \"kubernetes.io/projected/16a7e520-dadf-46bc-94c1-0755f0640698-kube-api-access-lz77j\") on node \"crc\" DevicePath \"\"" Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.732463 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:45 crc kubenswrapper[4954]: I1206 07:45:45.739987 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-th5ws"] Dec 06 07:45:47 crc kubenswrapper[4954]: I1206 07:45:47.460811 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" path="/var/lib/kubelet/pods/16a7e520-dadf-46bc-94c1-0755f0640698/volumes" Dec 06 07:45:49 crc kubenswrapper[4954]: I1206 07:45:49.443228 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:45:49 crc kubenswrapper[4954]: E1206 07:45:49.443883 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:46:02 crc kubenswrapper[4954]: I1206 07:46:02.443764 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:46:02 crc kubenswrapper[4954]: E1206 07:46:02.445010 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:46:15 crc kubenswrapper[4954]: I1206 07:46:15.448791 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:46:15 crc kubenswrapper[4954]: E1206 07:46:15.449857 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:46:29 crc kubenswrapper[4954]: I1206 07:46:29.444621 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:46:29 crc kubenswrapper[4954]: E1206 07:46:29.445884 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:46:44 crc kubenswrapper[4954]: I1206 07:46:44.443669 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:46:44 crc kubenswrapper[4954]: E1206 07:46:44.447304 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:46:58 crc kubenswrapper[4954]: I1206 07:46:58.443919 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:46:58 crc kubenswrapper[4954]: E1206 07:46:58.445006 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:47:11 crc kubenswrapper[4954]: I1206 07:47:11.444209 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:47:11 crc kubenswrapper[4954]: E1206 07:47:11.446048 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:47:24 crc kubenswrapper[4954]: I1206 07:47:24.444271 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:47:24 crc kubenswrapper[4954]: E1206 07:47:24.445434 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:47:36 crc kubenswrapper[4954]: I1206 07:47:36.444211 4954 
scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:47:36 crc kubenswrapper[4954]: E1206 07:47:36.445411 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:47:48 crc kubenswrapper[4954]: I1206 07:47:48.444179 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:47:48 crc kubenswrapper[4954]: E1206 07:47:48.446136 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:48:02 crc kubenswrapper[4954]: I1206 07:48:02.444485 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:48:02 crc kubenswrapper[4954]: E1206 07:48:02.446211 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:48:13 crc kubenswrapper[4954]: I1206 07:48:13.443928 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:48:13 crc kubenswrapper[4954]: E1206 07:48:13.445133 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:48:27 crc kubenswrapper[4954]: I1206 07:48:27.444025 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:48:27 crc kubenswrapper[4954]: E1206 07:48:27.445549 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:48:42 crc kubenswrapper[4954]: I1206 07:48:42.444262 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:48:42 crc kubenswrapper[4954]: E1206 07:48:42.445799 4954 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:48:57 crc kubenswrapper[4954]: I1206 07:48:57.443828 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:48:57 crc kubenswrapper[4954]: E1206 07:48:57.444971 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:49:11 crc kubenswrapper[4954]: I1206 07:49:11.443777 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:49:11 crc kubenswrapper[4954]: E1206 07:49:11.444911 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:49:23 crc kubenswrapper[4954]: I1206 07:49:23.444281 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:49:23 crc kubenswrapper[4954]: E1206 07:49:23.447288 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:49:35 crc kubenswrapper[4954]: I1206 07:49:35.452171 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:49:35 crc kubenswrapper[4954]: E1206 07:49:35.453341 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:49:50 crc kubenswrapper[4954]: I1206 07:49:50.443098 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:49:50 crc kubenswrapper[4954]: E1206 07:49:50.444340 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:50:01 crc kubenswrapper[4954]: I1206 07:50:01.444054 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:50:01 crc kubenswrapper[4954]: E1206 07:50:01.445105 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:50:16 crc kubenswrapper[4954]: I1206 07:50:16.443480 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:50:16 crc kubenswrapper[4954]: I1206 07:50:16.984618 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d"} Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.583634 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:27 crc kubenswrapper[4954]: E1206 07:52:27.584795 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="extract-utilities" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.584817 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="extract-utilities" Dec 06 07:52:27 crc kubenswrapper[4954]: E1206 07:52:27.584848 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="registry-server" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.584859 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="registry-server" Dec 06 07:52:27 crc kubenswrapper[4954]: E1206 07:52:27.584894 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="extract-content" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.584903 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="extract-content" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.585106 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a7e520-dadf-46bc-94c1-0755f0640698" containerName="registry-server" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.586278 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.601836 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.706449 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.706527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4r9\" (UniqueName: \"kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.706614 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.809862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.809964 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.810017 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4r9\" (UniqueName: \"kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.811064 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.811386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.836817 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qz4r9\" (UniqueName: \"kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9\") pod \"certified-operators-sjh82\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:27 crc kubenswrapper[4954]: I1206 07:52:27.912113 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:28 crc kubenswrapper[4954]: I1206 07:52:28.406726 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:28 crc kubenswrapper[4954]: I1206 07:52:28.514621 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerStarted","Data":"938e39e41c17430ee0052723d24c53ce6da876368773e662d06e103fc84c18eb"} Dec 06 07:52:29 crc kubenswrapper[4954]: I1206 07:52:29.526359 4954 generic.go:334] "Generic (PLEG): container finished" podID="033df9cc-fa98-456d-89be-f5c47b654544" containerID="5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488" exitCode=0 Dec 06 07:52:29 crc kubenswrapper[4954]: I1206 07:52:29.526651 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerDied","Data":"5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488"} Dec 06 07:52:29 crc kubenswrapper[4954]: I1206 07:52:29.530549 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:52:30 crc kubenswrapper[4954]: I1206 07:52:30.546519 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerStarted","Data":"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740"} Dec 06 07:52:31 crc kubenswrapper[4954]: I1206 07:52:31.558151 4954 generic.go:334] "Generic (PLEG): container finished" podID="033df9cc-fa98-456d-89be-f5c47b654544" containerID="734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740" exitCode=0 Dec 06 07:52:31 crc kubenswrapper[4954]: I1206 07:52:31.558246 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerDied","Data":"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740"} Dec 06 07:52:33 crc kubenswrapper[4954]: I1206 07:52:33.580742 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerStarted","Data":"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751"} Dec 06 07:52:33 crc kubenswrapper[4954]: I1206 07:52:33.603096 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjh82" podStartSLOduration=3.285484827 podStartE2EDuration="6.603063385s" podCreationTimestamp="2025-12-06 07:52:27 +0000 UTC" firstStartedPulling="2025-12-06 07:52:29.530188416 +0000 UTC m=+3324.343547805" lastFinishedPulling="2025-12-06 07:52:32.847766974 +0000 UTC m=+3327.661126363" observedRunningTime="2025-12-06 07:52:33.602421788 +0000 UTC m=+3328.415781167" watchObservedRunningTime="2025-12-06 
07:52:33.603063385 +0000 UTC m=+3328.416422774" Dec 06 07:52:37 crc kubenswrapper[4954]: I1206 07:52:37.912943 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:37 crc kubenswrapper[4954]: I1206 07:52:37.914711 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:37 crc kubenswrapper[4954]: I1206 07:52:37.956228 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:38 crc kubenswrapper[4954]: I1206 07:52:38.692259 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:38 crc kubenswrapper[4954]: I1206 07:52:38.746529 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:40 crc kubenswrapper[4954]: I1206 07:52:40.101634 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:52:40 crc kubenswrapper[4954]: I1206 07:52:40.101747 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:52:40 crc kubenswrapper[4954]: I1206 07:52:40.649947 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjh82" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="registry-server" containerID="cri-o://3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751" gracePeriod=2 Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.240784 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.274650 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities\") pod \"033df9cc-fa98-456d-89be-f5c47b654544\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.274746 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz4r9\" (UniqueName: \"kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9\") pod \"033df9cc-fa98-456d-89be-f5c47b654544\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.274918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content\") pod \"033df9cc-fa98-456d-89be-f5c47b654544\" (UID: \"033df9cc-fa98-456d-89be-f5c47b654544\") " Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.275683 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities" (OuterVolumeSpecName: "utilities") pod "033df9cc-fa98-456d-89be-f5c47b654544" (UID: "033df9cc-fa98-456d-89be-f5c47b654544"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.281784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9" (OuterVolumeSpecName: "kube-api-access-qz4r9") pod "033df9cc-fa98-456d-89be-f5c47b654544" (UID: "033df9cc-fa98-456d-89be-f5c47b654544"). InnerVolumeSpecName "kube-api-access-qz4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.351484 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "033df9cc-fa98-456d-89be-f5c47b654544" (UID: "033df9cc-fa98-456d-89be-f5c47b654544"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.377588 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.377660 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033df9cc-fa98-456d-89be-f5c47b654544-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.377671 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz4r9\" (UniqueName: \"kubernetes.io/projected/033df9cc-fa98-456d-89be-f5c47b654544-kube-api-access-qz4r9\") on node \"crc\" DevicePath \"\"" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.674642 4954 generic.go:334] "Generic (PLEG): container finished" podID="033df9cc-fa98-456d-89be-f5c47b654544" containerID="3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751" exitCode=0 Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.674701 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerDied","Data":"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751"} Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.674739 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh82" event={"ID":"033df9cc-fa98-456d-89be-f5c47b654544","Type":"ContainerDied","Data":"938e39e41c17430ee0052723d24c53ce6da876368773e662d06e103fc84c18eb"} Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.674764 4954 scope.go:117] "RemoveContainer" containerID="3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.674953 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjh82" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.700279 4954 scope.go:117] "RemoveContainer" containerID="734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.719214 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.728832 4954 scope.go:117] "RemoveContainer" containerID="5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.730719 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjh82"] Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.770779 4954 scope.go:117] "RemoveContainer" containerID="3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751" Dec 06 07:52:42 crc kubenswrapper[4954]: E1206 07:52:42.771218 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751\": container with ID starting with 3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751 not found: ID does not exist" containerID="3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.771248 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751"} err="failed to get container status \"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751\": rpc error: code = NotFound desc = could not find container \"3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751\": container with ID starting with 3498639b5b0ee9f9db42df80a13281728b395fed79be7eb87cd2118b922dc751 not found: ID does not exist" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.771272 4954 scope.go:117] "RemoveContainer" containerID="734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740" Dec 06 07:52:42 crc kubenswrapper[4954]: E1206 07:52:42.771645 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740\": container with ID starting with 734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740 not found: ID does not exist" containerID="734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.771741 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740"} err="failed to get container status \"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740\": rpc error: code = NotFound desc = could not find container \"734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740\": container with ID starting with 734be305558f65fe7f01701d368f6d1c867cad704be7455a8bb13b759cf73740 not found: ID does not exist" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.771755 4954 scope.go:117] "RemoveContainer" containerID="5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488" Dec 06 07:52:42 crc kubenswrapper[4954]: E1206 07:52:42.772082 4954 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488\": container with ID starting with 5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488 not found: ID does not exist" containerID="5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488" Dec 06 07:52:42 crc kubenswrapper[4954]: I1206 07:52:42.772105 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488"} err="failed to get container status \"5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488\": rpc error: code = NotFound desc = could not find container \"5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488\": container with ID starting with 5f49043a196f86f164b4285e8c49a99ea964fdeb9b6b371ade7a471aedbbc488 not found: ID does not exist" Dec 06 07:52:43 crc kubenswrapper[4954]: I1206 07:52:43.467581 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033df9cc-fa98-456d-89be-f5c47b654544" path="/var/lib/kubelet/pods/033df9cc-fa98-456d-89be-f5c47b654544/volumes" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.912895 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:52:57 crc kubenswrapper[4954]: E1206 07:52:57.914141 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="registry-server" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.914160 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="registry-server" Dec 06 07:52:57 crc kubenswrapper[4954]: E1206 07:52:57.914173 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="extract-content" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.914180 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="extract-content" Dec 06 07:52:57 crc kubenswrapper[4954]: E1206 07:52:57.914244 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="extract-utilities" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.914252 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="extract-utilities" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.914656 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="033df9cc-fa98-456d-89be-f5c47b654544" containerName="registry-server" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.915901 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.931056 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.974207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.974606 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgc8p\" (UniqueName: \"kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:57 crc kubenswrapper[4954]: I1206 07:52:57.974737 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.076894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgc8p\" (UniqueName: \"kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.076952 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.077035 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.077618 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.078074 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.107953 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lgc8p\" (UniqueName: \"kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p\") pod \"redhat-operators-hdwdm\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.240282 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.724798 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:52:58 crc kubenswrapper[4954]: I1206 07:52:58.809524 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerStarted","Data":"ab5ff5d381425131b23cc0d48dc157cd059f51772b792bb49bc544a57615d4d3"} Dec 06 07:52:59 crc kubenswrapper[4954]: I1206 07:52:59.833434 4954 generic.go:334] "Generic (PLEG): container finished" podID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerID="51c5f8f428216f6365241b6e263822e8bdbe90bd3bc31b359edbf447000c9014" exitCode=0 Dec 06 07:52:59 crc kubenswrapper[4954]: I1206 07:52:59.833594 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerDied","Data":"51c5f8f428216f6365241b6e263822e8bdbe90bd3bc31b359edbf447000c9014"} Dec 06 07:53:01 crc kubenswrapper[4954]: I1206 07:53:01.853424 4954 generic.go:334] "Generic (PLEG): container finished" podID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerID="85fab3d6a18cf52e354c2c80b1fd32f7dd97592ddcc8563dd1609b2acbf91225" exitCode=0 Dec 06 07:53:01 crc kubenswrapper[4954]: I1206 07:53:01.853606 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerDied","Data":"85fab3d6a18cf52e354c2c80b1fd32f7dd97592ddcc8563dd1609b2acbf91225"} Dec 06 07:53:02 crc kubenswrapper[4954]: I1206 07:53:02.862950 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerStarted","Data":"4ddfcce6ed6518b4ee87703f0d40427598c47d1d79183048bd5e3c61101b871a"} Dec 06 07:53:02 crc kubenswrapper[4954]: I1206 07:53:02.899082 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hdwdm" podStartSLOduration=3.472018396 podStartE2EDuration="5.899042501s" podCreationTimestamp="2025-12-06 07:52:57 +0000 UTC" firstStartedPulling="2025-12-06 07:52:59.836614327 +0000 UTC m=+3354.649973726" lastFinishedPulling="2025-12-06 07:53:02.263638442 +0000 UTC m=+3357.076997831" observedRunningTime="2025-12-06 07:53:02.895668101 +0000 UTC m=+3357.709027500" watchObservedRunningTime="2025-12-06 07:53:02.899042501 +0000 UTC m=+3357.712401890" Dec 06 07:53:08 crc kubenswrapper[4954]: I1206 07:53:08.240853 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:08 crc kubenswrapper[4954]: I1206 07:53:08.242852 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:08 crc kubenswrapper[4954]: I1206 07:53:08.293889 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:08 crc kubenswrapper[4954]: I1206 07:53:08.963482 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:10 crc kubenswrapper[4954]: I1206 07:53:10.100975 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:53:10 crc kubenswrapper[4954]: I1206 07:53:10.101070 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:53:10 crc kubenswrapper[4954]: I1206 07:53:10.790435 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:53:11 crc kubenswrapper[4954]: I1206 07:53:11.942975 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hdwdm" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="registry-server" containerID="cri-o://4ddfcce6ed6518b4ee87703f0d40427598c47d1d79183048bd5e3c61101b871a" gracePeriod=2 Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:14.999290 4954 generic.go:334] "Generic (PLEG): container finished" podID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerID="4ddfcce6ed6518b4ee87703f0d40427598c47d1d79183048bd5e3c61101b871a" exitCode=0 Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:14.999358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerDied","Data":"4ddfcce6ed6518b4ee87703f0d40427598c47d1d79183048bd5e3c61101b871a"} Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.256235 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.402890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities\") pod \"acc337a2-3e6f-4568-bdae-59470e5d3988\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.403019 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content\") pod \"acc337a2-3e6f-4568-bdae-59470e5d3988\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.403192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgc8p\" (UniqueName: \"kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p\") pod \"acc337a2-3e6f-4568-bdae-59470e5d3988\" (UID: \"acc337a2-3e6f-4568-bdae-59470e5d3988\") " Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.404201 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities" (OuterVolumeSpecName: "utilities") pod "acc337a2-3e6f-4568-bdae-59470e5d3988" (UID: "acc337a2-3e6f-4568-bdae-59470e5d3988"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.411318 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p" (OuterVolumeSpecName: "kube-api-access-lgc8p") pod "acc337a2-3e6f-4568-bdae-59470e5d3988" (UID: "acc337a2-3e6f-4568-bdae-59470e5d3988"). InnerVolumeSpecName "kube-api-access-lgc8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.505154 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgc8p\" (UniqueName: \"kubernetes.io/projected/acc337a2-3e6f-4568-bdae-59470e5d3988-kube-api-access-lgc8p\") on node \"crc\" DevicePath \"\"" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.505191 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.527998 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acc337a2-3e6f-4568-bdae-59470e5d3988" (UID: "acc337a2-3e6f-4568-bdae-59470e5d3988"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:53:15 crc kubenswrapper[4954]: I1206 07:53:15.606797 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc337a2-3e6f-4568-bdae-59470e5d3988-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.012445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdwdm" event={"ID":"acc337a2-3e6f-4568-bdae-59470e5d3988","Type":"ContainerDied","Data":"ab5ff5d381425131b23cc0d48dc157cd059f51772b792bb49bc544a57615d4d3"} Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.012510 4954 scope.go:117] "RemoveContainer" containerID="4ddfcce6ed6518b4ee87703f0d40427598c47d1d79183048bd5e3c61101b871a" Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.012630 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdwdm" Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.040164 4954 scope.go:117] "RemoveContainer" containerID="85fab3d6a18cf52e354c2c80b1fd32f7dd97592ddcc8563dd1609b2acbf91225" Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.070678 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.076168 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hdwdm"] Dec 06 07:53:16 crc kubenswrapper[4954]: I1206 07:53:16.082327 4954 scope.go:117] "RemoveContainer" containerID="51c5f8f428216f6365241b6e263822e8bdbe90bd3bc31b359edbf447000c9014" Dec 06 07:53:17 crc kubenswrapper[4954]: I1206 07:53:17.455809 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" path="/var/lib/kubelet/pods/acc337a2-3e6f-4568-bdae-59470e5d3988/volumes" Dec 06 07:53:40 crc kubenswrapper[4954]: I1206 07:53:40.101595 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:53:40 crc kubenswrapper[4954]: I1206 07:53:40.102273 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:53:40 crc kubenswrapper[4954]: I1206 07:53:40.102333 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:53:40 crc kubenswrapper[4954]: I1206 07:53:40.103168 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 07:53:40 crc kubenswrapper[4954]: I1206 07:53:40.103228 4954 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d" gracePeriod=600 Dec 06 07:53:41 crc kubenswrapper[4954]: I1206 07:53:41.243513 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d" exitCode=0 Dec 06 07:53:41 crc kubenswrapper[4954]: I1206 07:53:41.243612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d"} Dec 06 07:53:41 crc kubenswrapper[4954]: I1206 07:53:41.244930 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"} Dec 06 07:53:41 crc kubenswrapper[4954]: I1206 07:53:41.244988 4954 scope.go:117] "RemoveContainer" containerID="eca44478d0879bfe042d5d7bcd8463ff9c959eaebbc7ea76f9608cef5a9a234c" Dec 06 07:55:40 crc kubenswrapper[4954]: I1206 07:55:40.102015 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:55:40 crc kubenswrapper[4954]: I1206 07:55:40.102934 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.526145 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:04 crc kubenswrapper[4954]: E1206 07:56:04.527271 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="extract-content" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.527285 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="extract-content" Dec 06 07:56:04 crc kubenswrapper[4954]: E1206 07:56:04.527296 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="registry-server" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.527305 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="registry-server" Dec 06 07:56:04 crc kubenswrapper[4954]: E1206 07:56:04.527318 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="extract-utilities" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.527326 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="extract-utilities" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.527508 4954 
memory_manager.go:354] "RemoveStaleState removing state" podUID="acc337a2-3e6f-4568-bdae-59470e5d3988" containerName="registry-server" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.528691 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.545964 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.726941 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.727008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.727095 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vvkl\" (UniqueName: \"kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.828367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.828461 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.828529 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vvkl\" (UniqueName: \"kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.829361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.829506 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
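The RemoveStaleState / "Deleted CPUSet assignment" lines above show the CPU and memory managers dropping per-container resource assignments left behind by the deleted catalog pod before the new one is admitted. A rough sketch of that bookkeeping (types and the activePods shape are hypothetical, not kubelet internals):

```go
// Stale-state purge sketch: assignments are keyed by (podUID, container);
// entries whose pod no longer exists are dropped, mirroring the
// "RemoveStaleState: removing container" lines in the log.
package sketch

type key struct{ podUID, container string }

type state struct {
	assignments map[key][]int // e.g. CPU IDs assigned to the container
}

func (s *state) removeStaleState(activePods map[string]bool) {
	for k := range s.assignments {
		if !activePods[k.podUID] {
			delete(s.assignments, k) // safe: delete during range is allowed
		}
	}
}
```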
\"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.861197 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vvkl\" (UniqueName: \"kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl\") pod \"community-operators-hftdx\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:04 crc kubenswrapper[4954]: I1206 07:56:04.869530 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:05 crc kubenswrapper[4954]: I1206 07:56:05.234533 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:06 crc kubenswrapper[4954]: I1206 07:56:06.257370 4954 generic.go:334] "Generic (PLEG): container finished" podID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerID="2b0c2334ce95746b395232b5f997b2c3d9bea7a18ccf77d7332380263fc7e975" exitCode=0 Dec 06 07:56:06 crc kubenswrapper[4954]: I1206 07:56:06.257487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerDied","Data":"2b0c2334ce95746b395232b5f997b2c3d9bea7a18ccf77d7332380263fc7e975"} Dec 06 07:56:06 crc kubenswrapper[4954]: I1206 07:56:06.258055 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerStarted","Data":"d0f861e6d3a826aa712b7e7db8fb6b20d675865921ca90be93743a5dc9a15a42"} Dec 06 07:56:07 crc kubenswrapper[4954]: I1206 07:56:07.291741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerStarted","Data":"e152523a6457cb1e24afa8d66e5bc211051c909c1fb6591995731745bc799e06"} Dec 06 07:56:08 crc kubenswrapper[4954]: I1206 07:56:08.304407 4954 generic.go:334] "Generic (PLEG): container finished" podID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerID="e152523a6457cb1e24afa8d66e5bc211051c909c1fb6591995731745bc799e06" exitCode=0 Dec 06 07:56:08 crc kubenswrapper[4954]: I1206 07:56:08.304768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerDied","Data":"e152523a6457cb1e24afa8d66e5bc211051c909c1fb6591995731745bc799e06"} Dec 06 07:56:09 crc kubenswrapper[4954]: I1206 07:56:09.317140 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerStarted","Data":"809ccab58fdbb0f397facba9bc9196dd30eba4bf3ff3705f2e52b36448923793"} Dec 06 07:56:09 crc kubenswrapper[4954]: I1206 07:56:09.347458 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hftdx" podStartSLOduration=2.9085754379999997 podStartE2EDuration="5.347436964s" podCreationTimestamp="2025-12-06 07:56:04 +0000 UTC" firstStartedPulling="2025-12-06 07:56:06.260453338 +0000 UTC m=+3541.073812737" lastFinishedPulling="2025-12-06 07:56:08.699314864 
+0000 UTC m=+3543.512674263" observedRunningTime="2025-12-06 07:56:09.343351206 +0000 UTC m=+3544.156710645" watchObservedRunningTime="2025-12-06 07:56:09.347436964 +0000 UTC m=+3544.160796353" Dec 06 07:56:10 crc kubenswrapper[4954]: I1206 07:56:10.102616 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:56:10 crc kubenswrapper[4954]: I1206 07:56:10.102716 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:56:14 crc kubenswrapper[4954]: I1206 07:56:14.870723 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:14 crc kubenswrapper[4954]: I1206 07:56:14.871387 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:14 crc kubenswrapper[4954]: I1206 07:56:14.925941 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:15 crc kubenswrapper[4954]: I1206 07:56:15.453084 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:15 crc kubenswrapper[4954]: I1206 07:56:15.512169 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:17 crc kubenswrapper[4954]: I1206 07:56:17.402122 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hftdx" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="registry-server" containerID="cri-o://809ccab58fdbb0f397facba9bc9196dd30eba4bf3ff3705f2e52b36448923793" gracePeriod=2 Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.429179 4954 generic.go:334] "Generic (PLEG): container finished" podID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerID="809ccab58fdbb0f397facba9bc9196dd30eba4bf3ff3705f2e52b36448923793" exitCode=0 Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.429620 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerDied","Data":"809ccab58fdbb0f397facba9bc9196dd30eba4bf3ff3705f2e52b36448923793"} Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.673200 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.817342 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content\") pod \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.817651 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities\") pod \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.817705 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vvkl\" (UniqueName: \"kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl\") pod \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\" (UID: \"6ce2db68-dfb2-48ac-814b-2ebaa83cca04\") " Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.819224 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities" (OuterVolumeSpecName: "utilities") pod "6ce2db68-dfb2-48ac-814b-2ebaa83cca04" (UID: "6ce2db68-dfb2-48ac-814b-2ebaa83cca04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.831839 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl" (OuterVolumeSpecName: "kube-api-access-2vvkl") pod "6ce2db68-dfb2-48ac-814b-2ebaa83cca04" (UID: "6ce2db68-dfb2-48ac-814b-2ebaa83cca04"). InnerVolumeSpecName "kube-api-access-2vvkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.901289 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ce2db68-dfb2-48ac-814b-2ebaa83cca04" (UID: "6ce2db68-dfb2-48ac-814b-2ebaa83cca04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.919728 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.919772 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vvkl\" (UniqueName: \"kubernetes.io/projected/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-kube-api-access-2vvkl\") on node \"crc\" DevicePath \"\"" Dec 06 07:56:19 crc kubenswrapper[4954]: I1206 07:56:19.919807 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce2db68-dfb2-48ac-814b-2ebaa83cca04-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.445950 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hftdx" event={"ID":"6ce2db68-dfb2-48ac-814b-2ebaa83cca04","Type":"ContainerDied","Data":"d0f861e6d3a826aa712b7e7db8fb6b20d675865921ca90be93743a5dc9a15a42"} Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.446044 4954 scope.go:117] "RemoveContainer" containerID="809ccab58fdbb0f397facba9bc9196dd30eba4bf3ff3705f2e52b36448923793" Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.446314 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hftdx" Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.480490 4954 scope.go:117] "RemoveContainer" containerID="e152523a6457cb1e24afa8d66e5bc211051c909c1fb6591995731745bc799e06" Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.526423 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.533607 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hftdx"] Dec 06 07:56:20 crc kubenswrapper[4954]: I1206 07:56:20.538890 4954 scope.go:117] "RemoveContainer" containerID="2b0c2334ce95746b395232b5f997b2c3d9bea7a18ccf77d7332380263fc7e975" Dec 06 07:56:21 crc kubenswrapper[4954]: I1206 07:56:21.455389 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" path="/var/lib/kubelet/pods/6ce2db68-dfb2-48ac-814b-2ebaa83cca04/volumes" Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.101501 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.102798 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.102904 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.103786 4954 
Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.103864 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" gracePeriod=600
Dec 06 07:56:40 crc kubenswrapper[4954]: E1206 07:56:40.232201 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.631374 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" exitCode=0
Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.631445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"}
Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.631526 4954 scope.go:117] "RemoveContainer" containerID="c5bb66dc3bbdf312f3962f2389fa4c8d80855d7bb04ebbd1a394571325d7d80d"
Dec 06 07:56:40 crc kubenswrapper[4954]: I1206 07:56:40.632335 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:56:40 crc kubenswrapper[4954]: E1206 07:56:40.632633 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:56:51 crc kubenswrapper[4954]: I1206 07:56:51.444249 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:56:51 crc kubenswrapper[4954]: E1206 07:56:51.445651 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:57:06 crc kubenswrapper[4954]: I1206 07:57:06.443804 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:57:06 crc kubenswrapper[4954]: E1206 07:57:06.444703 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:57:19 crc kubenswrapper[4954]: I1206 07:57:19.444011 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:57:19 crc kubenswrapper[4954]: E1206 07:57:19.444939 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:57:33 crc kubenswrapper[4954]: I1206 07:57:33.443413 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:57:33 crc kubenswrapper[4954]: E1206 07:57:33.444344 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:57:44 crc kubenswrapper[4954]: I1206 07:57:44.444251 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:57:44 crc kubenswrapper[4954]: E1206 07:57:44.445475 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:57:58 crc kubenswrapper[4954]: I1206 07:57:58.444359 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:57:58 crc kubenswrapper[4954]: E1206 07:57:58.445389 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:58:09 crc kubenswrapper[4954]: I1206 07:58:09.444295 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:58:09 crc kubenswrapper[4954]: E1206 07:58:09.445529 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:58:24 crc kubenswrapper[4954]: I1206 07:58:24.443673 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:58:24 crc kubenswrapper[4954]: E1206 07:58:24.444606 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:58:36 crc kubenswrapper[4954]: I1206 07:58:36.443848 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:58:36 crc kubenswrapper[4954]: E1206 07:58:36.444864 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:58:47 crc kubenswrapper[4954]: I1206 07:58:47.443790 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:58:47 crc kubenswrapper[4954]: E1206 07:58:47.445805 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:59:00 crc kubenswrapper[4954]: I1206 07:59:00.443385 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:59:00 crc kubenswrapper[4954]: E1206 07:59:00.444200 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 07:59:15 crc kubenswrapper[4954]: I1206 07:59:15.447751 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 07:59:15 crc kubenswrapper[4954]: E1206 07:59:15.448723 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:59:26 crc kubenswrapper[4954]: I1206 07:59:26.443749 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 07:59:26 crc kubenswrapper[4954]: E1206 07:59:26.444682 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:59:41 crc kubenswrapper[4954]: I1206 07:59:41.444224 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 07:59:41 crc kubenswrapper[4954]: E1206 07:59:41.445285 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.808823 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:43 crc kubenswrapper[4954]: E1206 07:59:43.809559 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="extract-utilities" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.809614 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="extract-utilities" Dec 06 07:59:43 crc kubenswrapper[4954]: E1206 07:59:43.809648 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="extract-content" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.809661 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="extract-content" Dec 06 07:59:43 crc kubenswrapper[4954]: E1206 07:59:43.809682 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="registry-server" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.809695 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="registry-server" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.810011 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce2db68-dfb2-48ac-814b-2ebaa83cca04" containerName="registry-server" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.812611 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.828819 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.900061 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.900117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmwng\" (UniqueName: \"kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:43 crc kubenswrapper[4954]: I1206 07:59:43.900156 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.001933 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.001988 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmwng\" (UniqueName: \"kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.002016 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.002467 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.002757 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.042804 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nmwng\" (UniqueName: \"kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng\") pod \"redhat-marketplace-7z29x\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.137596 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:44 crc kubenswrapper[4954]: I1206 07:59:44.605617 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:45 crc kubenswrapper[4954]: I1206 07:59:45.481439 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerID="1fa4773e9a15cd5e07f619360da2f9ebb2a3f60ffad0f97606d6fc331abf1648" exitCode=0 Dec 06 07:59:45 crc kubenswrapper[4954]: I1206 07:59:45.481760 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerDied","Data":"1fa4773e9a15cd5e07f619360da2f9ebb2a3f60ffad0f97606d6fc331abf1648"} Dec 06 07:59:45 crc kubenswrapper[4954]: I1206 07:59:45.481917 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerStarted","Data":"5582bed373737f2b641d8f085315d715b07d4e5867dea90793fd6b50ce84b55c"} Dec 06 07:59:45 crc kubenswrapper[4954]: I1206 07:59:45.483867 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 07:59:46 crc kubenswrapper[4954]: I1206 07:59:46.495844 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerStarted","Data":"52abbf8adebde457184ca833ab20da8c933329ce4e7e46318ec38a60832fb0bd"} Dec 06 07:59:46 crc kubenswrapper[4954]: E1206 07:59:46.576674 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4330a2c_20e1_4e9e_b9fe_7ddd563c5a90.slice/crio-conmon-52abbf8adebde457184ca833ab20da8c933329ce4e7e46318ec38a60832fb0bd.scope\": RecentStats: unable to find data in memory cache]" Dec 06 07:59:47 crc kubenswrapper[4954]: I1206 07:59:47.507266 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerID="52abbf8adebde457184ca833ab20da8c933329ce4e7e46318ec38a60832fb0bd" exitCode=0 Dec 06 07:59:47 crc kubenswrapper[4954]: I1206 07:59:47.507338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerDied","Data":"52abbf8adebde457184ca833ab20da8c933329ce4e7e46318ec38a60832fb0bd"} Dec 06 07:59:48 crc kubenswrapper[4954]: I1206 07:59:48.518048 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerStarted","Data":"9a3306f67e0f6529f38c8833ab338598739bf1913422d5088fdb9e521f4c1b67"} Dec 06 07:59:48 crc kubenswrapper[4954]: I1206 07:59:48.542120 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7z29x" 
podStartSLOduration=3.120498556 podStartE2EDuration="5.542090733s" podCreationTimestamp="2025-12-06 07:59:43 +0000 UTC" firstStartedPulling="2025-12-06 07:59:45.483623723 +0000 UTC m=+3760.296983112" lastFinishedPulling="2025-12-06 07:59:47.90521589 +0000 UTC m=+3762.718575289" observedRunningTime="2025-12-06 07:59:48.540162062 +0000 UTC m=+3763.353521481" watchObservedRunningTime="2025-12-06 07:59:48.542090733 +0000 UTC m=+3763.355450122" Dec 06 07:59:54 crc kubenswrapper[4954]: I1206 07:59:54.138458 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:54 crc kubenswrapper[4954]: I1206 07:59:54.139284 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:54 crc kubenswrapper[4954]: I1206 07:59:54.202324 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:54 crc kubenswrapper[4954]: I1206 07:59:54.760964 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:54 crc kubenswrapper[4954]: I1206 07:59:54.845136 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:56 crc kubenswrapper[4954]: I1206 07:59:56.443901 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 07:59:56 crc kubenswrapper[4954]: E1206 07:59:56.444921 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 07:59:56 crc kubenswrapper[4954]: I1206 07:59:56.722303 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7z29x" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="registry-server" containerID="cri-o://9a3306f67e0f6529f38c8833ab338598739bf1913422d5088fdb9e521f4c1b67" gracePeriod=2 Dec 06 07:59:57 crc kubenswrapper[4954]: I1206 07:59:57.737334 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerID="9a3306f67e0f6529f38c8833ab338598739bf1913422d5088fdb9e521f4c1b67" exitCode=0 Dec 06 07:59:57 crc kubenswrapper[4954]: I1206 07:59:57.737427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerDied","Data":"9a3306f67e0f6529f38c8833ab338598739bf1913422d5088fdb9e521f4c1b67"} Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.344941 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.473052 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content\") pod \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.473234 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmwng\" (UniqueName: \"kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng\") pod \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.473299 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities\") pod \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\" (UID: \"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90\") " Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.474535 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities" (OuterVolumeSpecName: "utilities") pod "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" (UID: "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.482402 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng" (OuterVolumeSpecName: "kube-api-access-nmwng") pod "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" (UID: "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90"). InnerVolumeSpecName "kube-api-access-nmwng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.497833 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" (UID: "b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.575764 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmwng\" (UniqueName: \"kubernetes.io/projected/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-kube-api-access-nmwng\") on node \"crc\" DevicePath \"\"" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.576303 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.576320 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.748851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7z29x" event={"ID":"b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90","Type":"ContainerDied","Data":"5582bed373737f2b641d8f085315d715b07d4e5867dea90793fd6b50ce84b55c"} Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.748931 4954 scope.go:117] "RemoveContainer" containerID="9a3306f67e0f6529f38c8833ab338598739bf1913422d5088fdb9e521f4c1b67" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.748929 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7z29x" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.789013 4954 scope.go:117] "RemoveContainer" containerID="52abbf8adebde457184ca833ab20da8c933329ce4e7e46318ec38a60832fb0bd" Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.813670 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.826221 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7z29x"] Dec 06 07:59:58 crc kubenswrapper[4954]: I1206 07:59:58.840401 4954 scope.go:117] "RemoveContainer" containerID="1fa4773e9a15cd5e07f619360da2f9ebb2a3f60ffad0f97606d6fc331abf1648" Dec 06 07:59:59 crc kubenswrapper[4954]: I1206 07:59:59.454204 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" path="/var/lib/kubelet/pods/b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90/volumes" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.181764 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m"] Dec 06 08:00:00 crc kubenswrapper[4954]: E1206 08:00:00.182165 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="registry-server" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.182182 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="registry-server" Dec 06 08:00:00 crc kubenswrapper[4954]: E1206 08:00:00.182199 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="extract-content" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.182207 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="extract-content" Dec 06 08:00:00 crc kubenswrapper[4954]: E1206 
08:00:00.182229 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="extract-utilities" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.182235 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="extract-utilities" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.182402 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4330a2c-20e1-4e9e-b9fe-7ddd563c5a90" containerName="registry-server" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.183032 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.185919 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.185938 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.200198 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m"] Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.309148 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.309215 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.309276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4rf\" (UniqueName: \"kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.410433 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4rf\" (UniqueName: \"kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.410580 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" 
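The volume entries around this point follow a fixed three-phase pattern per volume: "VerifyControllerAttachedVolume started", then "operationExecutor.MountVolume started", then "MountVolume.SetUp succeeded". A minimal sketch of pairing the last two phases to estimate per-volume setup latency, assuming the journal is piped one entry per line (e.g. journalctl -u kubelet) and that the regexes only need the fields visible in this log; the script is hypothetical tooling, not anything the journal itself references:

#!/usr/bin/env python3
# Hypothetical helper, not part of the kubelet: pair "MountVolume started"
# with "MountVolume.SetUp succeeded" and print elapsed time per volume.
# Assumes one journal entry per line, as journalctl emits it, including the
# klog-escaped inner quotes (\"volume-name\") seen in this log.
import re
import sys
from datetime import datetime

TS = re.compile(r"^([A-Z][a-z]{2} [ \d]\d \d{2}:\d{2}:\d{2}) ")
STARTED = re.compile(r'MountVolume started for volume \\?"(?P<vol>[^"\\]+)')
DONE = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"(?P<vol>[^"\\]+)')

started = {}  # volume name -> timestamp of its "MountVolume started" entry
for line in sys.stdin:
    ts_m = TS.match(line)
    if not ts_m:
        continue
    # Journal timestamps carry no year; that is fine for same-day deltas.
    ts = datetime.strptime(ts_m.group(1), "%b %d %H:%M:%S")
    if (m := STARTED.search(line)):
        started[m.group("vol")] = ts
    elif (m := DONE.search(line)) and m.group("vol") in started:
        dt = (ts - started.pop(m.group("vol"))).total_seconds()
        print(f"{m.group('vol')}: SetUp succeeded after ~{dt:.0f}s")

Keying on the volume name alone is enough within a single pod's episode, as here; a sturdier version would key on the full UniqueName, which embeds the pod UID.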
Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.410612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.412647 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.419436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.435936 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4rf\" (UniqueName: \"kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf\") pod \"collect-profiles-29416800-hbn6m\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.502773 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.755396 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m"] Dec 06 08:00:00 crc kubenswrapper[4954]: I1206 08:00:00.779905 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" event={"ID":"9dded35e-7239-430c-94a8-52974cd4eb0c","Type":"ContainerStarted","Data":"ce4116820371d9ce2990612cba403562e7bbe2e75f60c40b07640a357d577d3f"} Dec 06 08:00:01 crc kubenswrapper[4954]: I1206 08:00:01.789585 4954 generic.go:334] "Generic (PLEG): container finished" podID="9dded35e-7239-430c-94a8-52974cd4eb0c" containerID="5611d172e5cbcc3fc432e189f4c58475582725ed0b2a724d1201575875585059" exitCode=0 Dec 06 08:00:01 crc kubenswrapper[4954]: I1206 08:00:01.789637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" event={"ID":"9dded35e-7239-430c-94a8-52974cd4eb0c","Type":"ContainerDied","Data":"5611d172e5cbcc3fc432e189f4c58475582725ed0b2a724d1201575875585059"} Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.114830 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.256984 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume\") pod \"9dded35e-7239-430c-94a8-52974cd4eb0c\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.257143 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn4rf\" (UniqueName: \"kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf\") pod \"9dded35e-7239-430c-94a8-52974cd4eb0c\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.257250 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume\") pod \"9dded35e-7239-430c-94a8-52974cd4eb0c\" (UID: \"9dded35e-7239-430c-94a8-52974cd4eb0c\") " Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.258633 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume" (OuterVolumeSpecName: "config-volume") pod "9dded35e-7239-430c-94a8-52974cd4eb0c" (UID: "9dded35e-7239-430c-94a8-52974cd4eb0c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.264517 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf" (OuterVolumeSpecName: "kube-api-access-fn4rf") pod "9dded35e-7239-430c-94a8-52974cd4eb0c" (UID: "9dded35e-7239-430c-94a8-52974cd4eb0c"). InnerVolumeSpecName "kube-api-access-fn4rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.265395 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9dded35e-7239-430c-94a8-52974cd4eb0c" (UID: "9dded35e-7239-430c-94a8-52974cd4eb0c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.359312 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn4rf\" (UniqueName: \"kubernetes.io/projected/9dded35e-7239-430c-94a8-52974cd4eb0c-kube-api-access-fn4rf\") on node \"crc\" DevicePath \"\"" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.359386 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dded35e-7239-430c-94a8-52974cd4eb0c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.359400 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dded35e-7239-430c-94a8-52974cd4eb0c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.823140 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" event={"ID":"9dded35e-7239-430c-94a8-52974cd4eb0c","Type":"ContainerDied","Data":"ce4116820371d9ce2990612cba403562e7bbe2e75f60c40b07640a357d577d3f"} Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.823203 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce4116820371d9ce2990612cba403562e7bbe2e75f60c40b07640a357d577d3f" Dec 06 08:00:03 crc kubenswrapper[4954]: I1206 08:00:03.823279 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m" Dec 06 08:00:04 crc kubenswrapper[4954]: I1206 08:00:04.211388 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j"] Dec 06 08:00:04 crc kubenswrapper[4954]: I1206 08:00:04.219438 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416755-nkc4j"] Dec 06 08:00:05 crc kubenswrapper[4954]: I1206 08:00:05.456995 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b53cf7-013c-4df3-82de-c438b35806ac" path="/var/lib/kubelet/pods/94b53cf7-013c-4df3-82de-c438b35806ac/volumes" Dec 06 08:00:10 crc kubenswrapper[4954]: I1206 08:00:10.443410 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:00:10 crc kubenswrapper[4954]: E1206 08:00:10.444526 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:00:21 crc kubenswrapper[4954]: I1206 08:00:21.309648 4954 scope.go:117] "RemoveContainer" containerID="9bf0a4efe2e24ad85348bb01c4e3fe64356fe21da9707c1dbcbb87c496f8dea8" Dec 06 08:00:21 crc kubenswrapper[4954]: I1206 08:00:21.443917 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:00:21 crc kubenswrapper[4954]: E1206 08:00:21.444846 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:00:34 crc kubenswrapper[4954]: I1206 08:00:34.443459 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:00:34 crc kubenswrapper[4954]: E1206 08:00:34.446488 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:00:47 crc kubenswrapper[4954]: I1206 08:00:47.443943 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:00:47 crc kubenswrapper[4954]: E1206 08:00:47.445105 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:01:01 crc kubenswrapper[4954]: I1206 08:01:01.444750 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:01:01 crc kubenswrapper[4954]: E1206 08:01:01.448385 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:01:14 crc kubenswrapper[4954]: I1206 08:01:14.444178 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:01:14 crc kubenswrapper[4954]: E1206 08:01:14.445760 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:01:26 crc kubenswrapper[4954]: I1206 08:01:26.443912 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:01:26 crc kubenswrapper[4954]: E1206 08:01:26.444886 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:01:37 crc kubenswrapper[4954]: I1206 08:01:37.444514 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:01:37 crc kubenswrapper[4954]: E1206 08:01:37.446143 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:01:48 crc kubenswrapper[4954]: I1206 08:01:48.444319 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b" Dec 06 08:01:48 crc kubenswrapper[4954]: I1206 08:01:48.815897 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a"} Dec 06 08:03:43 crc kubenswrapper[4954]: I1206 08:03:43.971644 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:03:43 crc kubenswrapper[4954]: E1206 08:03:43.973049 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dded35e-7239-430c-94a8-52974cd4eb0c" containerName="collect-profiles" Dec 06 08:03:43 crc kubenswrapper[4954]: I1206 08:03:43.973066 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dded35e-7239-430c-94a8-52974cd4eb0c" containerName="collect-profiles" Dec 06 08:03:43 crc kubenswrapper[4954]: I1206 08:03:43.973236 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dded35e-7239-430c-94a8-52974cd4eb0c" containerName="collect-profiles" Dec 06 08:03:43 crc kubenswrapper[4954]: I1206 08:03:43.974495 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.002816 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.127159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.127341 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rsj\" (UniqueName: \"kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.127404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.229413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rsj\" (UniqueName: \"kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.229491 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.229517 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.230888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.230921 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.371282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-66rsj\" (UniqueName: \"kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj\") pod \"redhat-operators-j2kn4\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:44 crc kubenswrapper[4954]: I1206 08:03:44.599735 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:45 crc kubenswrapper[4954]: I1206 08:03:45.054624 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:03:46 crc kubenswrapper[4954]: I1206 08:03:46.013451 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerID="d3d6c273697319822a20510ce8e1a748f1c26dadd2eda65f8a4c8339670f1762" exitCode=0 Dec 06 08:03:46 crc kubenswrapper[4954]: I1206 08:03:46.013622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerDied","Data":"d3d6c273697319822a20510ce8e1a748f1c26dadd2eda65f8a4c8339670f1762"} Dec 06 08:03:46 crc kubenswrapper[4954]: I1206 08:03:46.013929 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerStarted","Data":"0a910873f27e66b906a2e47a98775a70e8dac33ade02ce695e9f8f755fd83fb6"} Dec 06 08:03:47 crc kubenswrapper[4954]: I1206 08:03:47.024419 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerStarted","Data":"2b974417dab449fb75049a8752fb4576cddc76d1d14b3e9ba2bd5f97d90b3c77"} Dec 06 08:03:48 crc kubenswrapper[4954]: I1206 08:03:48.041227 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerID="2b974417dab449fb75049a8752fb4576cddc76d1d14b3e9ba2bd5f97d90b3c77" exitCode=0 Dec 06 08:03:48 crc kubenswrapper[4954]: I1206 08:03:48.041306 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerDied","Data":"2b974417dab449fb75049a8752fb4576cddc76d1d14b3e9ba2bd5f97d90b3c77"} Dec 06 08:03:49 crc kubenswrapper[4954]: I1206 08:03:49.052837 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerStarted","Data":"2d03ea1a546598f6ef8d155f56d64bc2b73650dd94c636ae56449a1b4b42f48e"} Dec 06 08:03:49 crc kubenswrapper[4954]: I1206 08:03:49.087252 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j2kn4" podStartSLOduration=3.649984464 podStartE2EDuration="6.087227591s" podCreationTimestamp="2025-12-06 08:03:43 +0000 UTC" firstStartedPulling="2025-12-06 08:03:46.018711791 +0000 UTC m=+4000.832071180" lastFinishedPulling="2025-12-06 08:03:48.455954918 +0000 UTC m=+4003.269314307" observedRunningTime="2025-12-06 08:03:49.080850376 +0000 UTC m=+4003.894209765" watchObservedRunningTime="2025-12-06 08:03:49.087227591 +0000 UTC m=+4003.900586980" Dec 06 08:03:54 crc kubenswrapper[4954]: I1206 08:03:54.600803 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 
08:03:54 crc kubenswrapper[4954]: I1206 08:03:54.601789 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:54 crc kubenswrapper[4954]: I1206 08:03:54.648265 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:55 crc kubenswrapper[4954]: I1206 08:03:55.184823 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:03:55 crc kubenswrapper[4954]: I1206 08:03:55.249752 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:03:57 crc kubenswrapper[4954]: I1206 08:03:57.125582 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j2kn4" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="registry-server" containerID="cri-o://2d03ea1a546598f6ef8d155f56d64bc2b73650dd94c636ae56449a1b4b42f48e" gracePeriod=2 Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.162584 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerID="2d03ea1a546598f6ef8d155f56d64bc2b73650dd94c636ae56449a1b4b42f48e" exitCode=0 Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.162639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerDied","Data":"2d03ea1a546598f6ef8d155f56d64bc2b73650dd94c636ae56449a1b4b42f48e"} Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.285504 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.426246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content\") pod \"5e277997-61c5-4d3d-8222-12f1eceaa016\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.426385 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rsj\" (UniqueName: \"kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj\") pod \"5e277997-61c5-4d3d-8222-12f1eceaa016\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.426450 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities\") pod \"5e277997-61c5-4d3d-8222-12f1eceaa016\" (UID: \"5e277997-61c5-4d3d-8222-12f1eceaa016\") " Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.427857 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities" (OuterVolumeSpecName: "utilities") pod "5e277997-61c5-4d3d-8222-12f1eceaa016" (UID: "5e277997-61c5-4d3d-8222-12f1eceaa016"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.428505 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.435270 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj" (OuterVolumeSpecName: "kube-api-access-66rsj") pod "5e277997-61c5-4d3d-8222-12f1eceaa016" (UID: "5e277997-61c5-4d3d-8222-12f1eceaa016"). InnerVolumeSpecName "kube-api-access-66rsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.531252 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rsj\" (UniqueName: \"kubernetes.io/projected/5e277997-61c5-4d3d-8222-12f1eceaa016-kube-api-access-66rsj\") on node \"crc\" DevicePath \"\"" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.541622 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e277997-61c5-4d3d-8222-12f1eceaa016" (UID: "5e277997-61c5-4d3d-8222-12f1eceaa016"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:04:00 crc kubenswrapper[4954]: I1206 08:04:00.633092 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e277997-61c5-4d3d-8222-12f1eceaa016-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.177924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2kn4" event={"ID":"5e277997-61c5-4d3d-8222-12f1eceaa016","Type":"ContainerDied","Data":"0a910873f27e66b906a2e47a98775a70e8dac33ade02ce695e9f8f755fd83fb6"} Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.178760 4954 scope.go:117] "RemoveContainer" containerID="2d03ea1a546598f6ef8d155f56d64bc2b73650dd94c636ae56449a1b4b42f48e" Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.177987 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2kn4" Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.202296 4954 scope.go:117] "RemoveContainer" containerID="2b974417dab449fb75049a8752fb4576cddc76d1d14b3e9ba2bd5f97d90b3c77" Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.223977 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.229703 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j2kn4"] Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.233836 4954 scope.go:117] "RemoveContainer" containerID="d3d6c273697319822a20510ce8e1a748f1c26dadd2eda65f8a4c8339670f1762" Dec 06 08:04:01 crc kubenswrapper[4954]: I1206 08:04:01.468928 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" path="/var/lib/kubelet/pods/5e277997-61c5-4d3d-8222-12f1eceaa016/volumes" Dec 06 08:04:10 crc kubenswrapper[4954]: I1206 08:04:10.101820 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:04:10 crc kubenswrapper[4954]: I1206 08:04:10.102678 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:04:40 crc kubenswrapper[4954]: I1206 08:04:40.101867 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:04:40 crc kubenswrapper[4954]: I1206 08:04:40.103003 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.101101 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.101967 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.102086 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.103278 4954 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.103428 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a" gracePeriod=600
Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.839374 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a" exitCode=0
Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.839485 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a"}
Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.839852 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"}
Dec 06 08:05:10 crc kubenswrapper[4954]: I1206 08:05:10.839880 4954 scope.go:117] "RemoveContainer" containerID="b6aa4613ae42df83ddd85c65bec1fd884572246d241fdd326789e414bbde493b"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.429684 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:05:48 crc kubenswrapper[4954]: E1206 08:05:48.433059 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="registry-server"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.433161 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="registry-server"
Dec 06 08:05:48 crc kubenswrapper[4954]: E1206 08:05:48.433247 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="extract-content"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.433304 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="extract-content"
Dec 06 08:05:48 crc kubenswrapper[4954]: E1206 08:05:48.433391 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="extract-utilities"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.433455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="extract-utilities"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.434386 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e277997-61c5-4d3d-8222-12f1eceaa016" containerName="registry-server"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.438398 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.449053 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.549898 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.549999 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.550075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8s6q\" (UniqueName: \"kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.652237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.652323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.652405 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8s6q\" (UniqueName: \"kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.653218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.653333 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.688024 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8s6q\" (UniqueName: \"kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q\") pod \"certified-operators-gnvzt\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") " pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:48 crc kubenswrapper[4954]: I1206 08:05:48.777773 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:49 crc kubenswrapper[4954]: I1206 08:05:49.150682 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:05:49 crc kubenswrapper[4954]: I1206 08:05:49.196468 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerStarted","Data":"4bc183f27e8c4e4ed58b3c01c9a98985914943ecdfd47e24d1cf9b8a382080d8"}
Dec 06 08:05:50 crc kubenswrapper[4954]: I1206 08:05:50.210487 4954 generic.go:334] "Generic (PLEG): container finished" podID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerID="f940f0bc7c8e33bd1843b33269dc7813d75c9476b111cf1d0c5181e837a0be10" exitCode=0
Dec 06 08:05:50 crc kubenswrapper[4954]: I1206 08:05:50.212629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerDied","Data":"f940f0bc7c8e33bd1843b33269dc7813d75c9476b111cf1d0c5181e837a0be10"}
Dec 06 08:05:50 crc kubenswrapper[4954]: I1206 08:05:50.214432 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 08:05:51 crc kubenswrapper[4954]: I1206 08:05:51.223861 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerStarted","Data":"362b5e111684db372bc55c13992319f1338b2b90445152db4c51b9a3abb17f65"}
Dec 06 08:05:52 crc kubenswrapper[4954]: I1206 08:05:52.235744 4954 generic.go:334] "Generic (PLEG): container finished" podID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerID="362b5e111684db372bc55c13992319f1338b2b90445152db4c51b9a3abb17f65" exitCode=0
Dec 06 08:05:52 crc kubenswrapper[4954]: I1206 08:05:52.235821 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerDied","Data":"362b5e111684db372bc55c13992319f1338b2b90445152db4c51b9a3abb17f65"}
Dec 06 08:05:53 crc kubenswrapper[4954]: I1206 08:05:53.248406 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerStarted","Data":"841515e671b45069a0138ffc0feeeb411d6f22903bc97be6144a95361a7c0333"}
Dec 06 08:05:53 crc kubenswrapper[4954]: I1206 08:05:53.271436 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gnvzt" podStartSLOduration=2.8573574109999997 podStartE2EDuration="5.271415877s" podCreationTimestamp="2025-12-06 08:05:48 +0000 UTC" firstStartedPulling="2025-12-06 08:05:50.214145579 +0000 UTC m=+4125.027504968" lastFinishedPulling="2025-12-06 08:05:52.628204005 +0000 UTC m=+4127.441563434" observedRunningTime="2025-12-06 08:05:53.269834336 +0000 UTC m=+4128.083193755" watchObservedRunningTime="2025-12-06 08:05:53.271415877 +0000 UTC m=+4128.084775266"
Dec 06 08:05:58 crc kubenswrapper[4954]: I1206 08:05:58.778215 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:58 crc kubenswrapper[4954]: I1206 08:05:58.779121 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:58 crc kubenswrapper[4954]: I1206 08:05:58.832876 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:59 crc kubenswrapper[4954]: I1206 08:05:59.392986 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:05:59 crc kubenswrapper[4954]: I1206 08:05:59.454257 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:06:01 crc kubenswrapper[4954]: I1206 08:06:01.365877 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gnvzt" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="registry-server" containerID="cri-o://841515e671b45069a0138ffc0feeeb411d6f22903bc97be6144a95361a7c0333" gracePeriod=2
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.390600 4954 generic.go:334] "Generic (PLEG): container finished" podID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerID="841515e671b45069a0138ffc0feeeb411d6f22903bc97be6144a95361a7c0333" exitCode=0
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.390642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerDied","Data":"841515e671b45069a0138ffc0feeeb411d6f22903bc97be6144a95361a7c0333"}
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.707733 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.862849 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content\") pod \"b3a44dc3-419d-4d4d-a25e-424212957a18\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") "
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.863069 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities\") pod \"b3a44dc3-419d-4d4d-a25e-424212957a18\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") "
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.863190 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8s6q\" (UniqueName: \"kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q\") pod \"b3a44dc3-419d-4d4d-a25e-424212957a18\" (UID: \"b3a44dc3-419d-4d4d-a25e-424212957a18\") "
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.864761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities" (OuterVolumeSpecName: "utilities") pod "b3a44dc3-419d-4d4d-a25e-424212957a18" (UID: "b3a44dc3-419d-4d4d-a25e-424212957a18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.885786 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q" (OuterVolumeSpecName: "kube-api-access-s8s6q") pod "b3a44dc3-419d-4d4d-a25e-424212957a18" (UID: "b3a44dc3-419d-4d4d-a25e-424212957a18"). InnerVolumeSpecName "kube-api-access-s8s6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.925216 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3a44dc3-419d-4d4d-a25e-424212957a18" (UID: "b3a44dc3-419d-4d4d-a25e-424212957a18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.965596 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.965651 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a44dc3-419d-4d4d-a25e-424212957a18-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:03 crc kubenswrapper[4954]: I1206 08:06:03.965667 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8s6q\" (UniqueName: \"kubernetes.io/projected/b3a44dc3-419d-4d4d-a25e-424212957a18-kube-api-access-s8s6q\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.403942 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvzt" event={"ID":"b3a44dc3-419d-4d4d-a25e-424212957a18","Type":"ContainerDied","Data":"4bc183f27e8c4e4ed58b3c01c9a98985914943ecdfd47e24d1cf9b8a382080d8"}
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.404001 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvzt"
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.404518 4954 scope.go:117] "RemoveContainer" containerID="841515e671b45069a0138ffc0feeeb411d6f22903bc97be6144a95361a7c0333"
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.438194 4954 scope.go:117] "RemoveContainer" containerID="362b5e111684db372bc55c13992319f1338b2b90445152db4c51b9a3abb17f65"
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.457679 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.469783 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gnvzt"]
Dec 06 08:06:04 crc kubenswrapper[4954]: I1206 08:06:04.477051 4954 scope.go:117] "RemoveContainer" containerID="f940f0bc7c8e33bd1843b33269dc7813d75c9476b111cf1d0c5181e837a0be10"
Dec 06 08:06:05 crc kubenswrapper[4954]: I1206 08:06:05.464719 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" path="/var/lib/kubelet/pods/b3a44dc3-419d-4d4d-a25e-424212957a18/volumes"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.289432 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:16 crc kubenswrapper[4954]: E1206 08:06:16.290639 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="registry-server"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.290657 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="registry-server"
Dec 06 08:06:16 crc kubenswrapper[4954]: E1206 08:06:16.290694 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="extract-utilities"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.290701 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="extract-utilities"
Dec 06 08:06:16 crc kubenswrapper[4954]: E1206 08:06:16.290714 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="extract-content"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.290721 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="extract-content"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.290880 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a44dc3-419d-4d4d-a25e-424212957a18" containerName="registry-server"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.292202 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.323552 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.380920 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwwc\" (UniqueName: \"kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.381001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.381069 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.483006 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwwc\" (UniqueName: \"kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.483596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.483779 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.484409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.484491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.508815 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwwc\" (UniqueName: \"kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc\") pod \"community-operators-dhld5\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") " pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.618125 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:16 crc kubenswrapper[4954]: I1206 08:06:16.982147 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:17 crc kubenswrapper[4954]: I1206 08:06:17.543476 4954 generic.go:334] "Generic (PLEG): container finished" podID="a57399ed-c982-4211-87dd-125d607e881a" containerID="c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684" exitCode=0
Dec 06 08:06:17 crc kubenswrapper[4954]: I1206 08:06:17.543597 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerDied","Data":"c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684"}
Dec 06 08:06:17 crc kubenswrapper[4954]: I1206 08:06:17.544090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerStarted","Data":"a056e42d7a83a0e964d247876461bcf504ba30f3b598b5384e65b66b88e35254"}
Dec 06 08:06:18 crc kubenswrapper[4954]: I1206 08:06:18.553631 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerStarted","Data":"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"}
Dec 06 08:06:19 crc kubenswrapper[4954]: I1206 08:06:19.564993 4954 generic.go:334] "Generic (PLEG): container finished" podID="a57399ed-c982-4211-87dd-125d607e881a" containerID="30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0" exitCode=0
Dec 06 08:06:19 crc kubenswrapper[4954]: I1206 08:06:19.565181 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerDied","Data":"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"}
Dec 06 08:06:20 crc kubenswrapper[4954]: I1206 08:06:20.576926 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerStarted","Data":"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"}
Dec 06 08:06:20 crc kubenswrapper[4954]: I1206 08:06:20.600278 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dhld5" podStartSLOduration=2.1672557120000002 podStartE2EDuration="4.600252837s" podCreationTimestamp="2025-12-06 08:06:16 +0000 UTC" firstStartedPulling="2025-12-06 08:06:17.546974303 +0000 UTC m=+4152.360333702" lastFinishedPulling="2025-12-06 08:06:19.979971418 +0000 UTC m=+4154.793330827" observedRunningTime="2025-12-06 08:06:20.594831026 +0000 UTC m=+4155.408190415" watchObservedRunningTime="2025-12-06 08:06:20.600252837 +0000 UTC m=+4155.413612226"
Dec 06 08:06:26 crc kubenswrapper[4954]: I1206 08:06:26.618603 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:26 crc kubenswrapper[4954]: I1206 08:06:26.619036 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:26 crc kubenswrapper[4954]: I1206 08:06:26.664743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:27 crc kubenswrapper[4954]: I1206 08:06:27.678095 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:27 crc kubenswrapper[4954]: I1206 08:06:27.738383 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:29 crc kubenswrapper[4954]: I1206 08:06:29.641393 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dhld5" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="registry-server" containerID="cri-o://57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943" gracePeriod=2
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.516933 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.626436 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content\") pod \"a57399ed-c982-4211-87dd-125d607e881a\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") "
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.626608 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities\") pod \"a57399ed-c982-4211-87dd-125d607e881a\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") "
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.626650 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxwwc\" (UniqueName: \"kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc\") pod \"a57399ed-c982-4211-87dd-125d607e881a\" (UID: \"a57399ed-c982-4211-87dd-125d607e881a\") "
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.627779 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities" (OuterVolumeSpecName: "utilities") pod "a57399ed-c982-4211-87dd-125d607e881a" (UID: "a57399ed-c982-4211-87dd-125d607e881a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.633330 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc" (OuterVolumeSpecName: "kube-api-access-gxwwc") pod "a57399ed-c982-4211-87dd-125d607e881a" (UID: "a57399ed-c982-4211-87dd-125d607e881a"). InnerVolumeSpecName "kube-api-access-gxwwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.654550 4954 generic.go:334] "Generic (PLEG): container finished" podID="a57399ed-c982-4211-87dd-125d607e881a" containerID="57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943" exitCode=0
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.654614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerDied","Data":"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"}
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.654642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhld5" event={"ID":"a57399ed-c982-4211-87dd-125d607e881a","Type":"ContainerDied","Data":"a056e42d7a83a0e964d247876461bcf504ba30f3b598b5384e65b66b88e35254"}
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.654660 4954 scope.go:117] "RemoveContainer" containerID="57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.654802 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhld5"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.696613 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a57399ed-c982-4211-87dd-125d607e881a" (UID: "a57399ed-c982-4211-87dd-125d607e881a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.696671 4954 scope.go:117] "RemoveContainer" containerID="30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.714162 4954 scope.go:117] "RemoveContainer" containerID="c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.728757 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.728794 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxwwc\" (UniqueName: \"kubernetes.io/projected/a57399ed-c982-4211-87dd-125d607e881a-kube-api-access-gxwwc\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.728805 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a57399ed-c982-4211-87dd-125d607e881a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.736842 4954 scope.go:117] "RemoveContainer" containerID="57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"
Dec 06 08:06:30 crc kubenswrapper[4954]: E1206 08:06:30.737460 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943\": container with ID starting with 57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943 not found: ID does not exist" containerID="57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.737522 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943"} err="failed to get container status \"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943\": rpc error: code = NotFound desc = could not find container \"57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943\": container with ID starting with 57945529e9b2e96d72fad00f0c890a5eba92fe8928da804c7820d4ea9cf1b943 not found: ID does not exist"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.737550 4954 scope.go:117] "RemoveContainer" containerID="30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"
Dec 06 08:06:30 crc kubenswrapper[4954]: E1206 08:06:30.737996 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0\": container with ID starting with 30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0 not found: ID does not exist" containerID="30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.738021 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0"} err="failed to get container status \"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0\": rpc error: code = NotFound desc = could not find container \"30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0\": container with ID starting with 30224e762f21dad03f7032d6b6dc42aaa02af01d900c2bdd05c262c03a2d97f0 not found: ID does not exist"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.738039 4954 scope.go:117] "RemoveContainer" containerID="c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684"
Dec 06 08:06:30 crc kubenswrapper[4954]: E1206 08:06:30.738549 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684\": container with ID starting with c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684 not found: ID does not exist" containerID="c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.738587 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684"} err="failed to get container status \"c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684\": rpc error: code = NotFound desc = could not find container \"c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684\": container with ID starting with c987751526957d6a3c65b196e8f74f00eb4c1f76245a24ac933156beb711a684 not found: ID does not exist"
Dec 06 08:06:30 crc kubenswrapper[4954]: I1206 08:06:30.994341 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:31 crc kubenswrapper[4954]: I1206 08:06:31.000361 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dhld5"]
Dec 06 08:06:31 crc kubenswrapper[4954]: I1206 08:06:31.452878 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57399ed-c982-4211-87dd-125d607e881a" path="/var/lib/kubelet/pods/a57399ed-c982-4211-87dd-125d607e881a/volumes"
Dec 06 08:07:10 crc kubenswrapper[4954]: I1206 08:07:10.101992 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:07:10 crc kubenswrapper[4954]: I1206 08:07:10.102748 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:07:40 crc kubenswrapper[4954]: I1206 08:07:40.101827 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:07:40 crc kubenswrapper[4954]: I1206 08:07:40.102857 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.101852 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.102681 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.102797 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.104228 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.104629 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316" gracePeriod=600
Dec 06 08:08:10 crc kubenswrapper[4954]: E1206 08:08:10.243661 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.625744 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316" exitCode=0
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.625843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"}
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.625964 4954 scope.go:117] "RemoveContainer" containerID="c2300b47345b86a27880c3fcf86e24393577c43bb1d3608f3efe09d39387c68a"
Dec 06 08:08:10 crc kubenswrapper[4954]: I1206 08:08:10.626835 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:08:10 crc kubenswrapper[4954]: E1206 08:08:10.627148 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:08:23 crc kubenswrapper[4954]: I1206 08:08:23.443549 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:08:23 crc kubenswrapper[4954]: E1206 08:08:23.444803 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:08:36 crc kubenswrapper[4954]: I1206 08:08:36.443729 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:08:36 crc kubenswrapper[4954]: E1206 08:08:36.446025 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:08:51 crc kubenswrapper[4954]: I1206 08:08:51.444267 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:08:51 crc kubenswrapper[4954]: E1206 08:08:51.445389 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:09:05 crc kubenswrapper[4954]: I1206 08:09:05.453165 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:09:05 crc kubenswrapper[4954]: E1206 08:09:05.454317 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:09:18 crc kubenswrapper[4954]: I1206 08:09:18.444745 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:09:18 crc kubenswrapper[4954]: E1206 08:09:18.445974 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:09:29 crc kubenswrapper[4954]: I1206 08:09:29.443381 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:09:29 crc kubenswrapper[4954]: E1206 08:09:29.444362 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:09:44 crc kubenswrapper[4954]: I1206 08:09:44.444151 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:09:44 crc kubenswrapper[4954]: E1206 08:09:44.445254 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:09:57 crc kubenswrapper[4954]: I1206 08:09:57.443356 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:09:57 crc kubenswrapper[4954]: E1206 08:09:57.444284 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:10:10 crc kubenswrapper[4954]: I1206 08:10:10.444492 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:10:10 crc kubenswrapper[4954]: E1206 08:10:10.446811 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:10:24 crc kubenswrapper[4954]: I1206 08:10:24.443650 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:10:24 crc kubenswrapper[4954]: E1206 08:10:24.444658 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:10:37 crc kubenswrapper[4954]: I1206 08:10:37.444492 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:10:37 crc kubenswrapper[4954]: E1206 08:10:37.445758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:10:50 crc kubenswrapper[4954]: I1206 08:10:50.443277 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:10:50 crc kubenswrapper[4954]: E1206 08:10:50.444257 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:11:01 crc kubenswrapper[4954]: I1206 08:11:01.444202 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:11:01 crc kubenswrapper[4954]: E1206 08:11:01.445079 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:11:15 crc kubenswrapper[4954]: I1206 08:11:15.449764 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:11:15 crc kubenswrapper[4954]: E1206 08:11:15.450802 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:11:26 crc kubenswrapper[4954]: I1206 08:11:26.443250 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:11:26 crc kubenswrapper[4954]: E1206 08:11:26.444315 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:11:40 crc kubenswrapper[4954]: I1206 08:11:40.444615 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:11:40 crc kubenswrapper[4954]: E1206 08:11:40.445894 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:11:52 crc kubenswrapper[4954]: I1206 08:11:52.443119 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:11:52 crc kubenswrapper[4954]: E1206 08:11:52.444038 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:12:07 crc kubenswrapper[4954]: I1206 08:12:07.443101 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:12:07 crc kubenswrapper[4954]: E1206 08:12:07.443946 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:12:19 crc kubenswrapper[4954]: I1206 08:12:19.444138 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:12:19 crc kubenswrapper[4954]: E1206 08:12:19.445936 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:12:31 crc kubenswrapper[4954]: I1206 08:12:31.444105 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:12:31 crc kubenswrapper[4954]: E1206 08:12:31.445050 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:12:44 crc kubenswrapper[4954]: I1206 08:12:44.443337 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:12:44 crc kubenswrapper[4954]: E1206 08:12:44.444337 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:12:57 crc kubenswrapper[4954]: I1206 08:12:57.444370 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:12:57 crc kubenswrapper[4954]: E1206 08:12:57.445326 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:13:11 crc kubenswrapper[4954]: I1206 08:13:11.448451 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316"
Dec 06 08:13:12 crc kubenswrapper[4954]: I1206 08:13:12.260284 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0"}
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.422340 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"]
Dec 06 08:13:56 crc kubenswrapper[4954]: E1206 08:13:56.423648 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="extract-utilities"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.423668 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="extract-utilities"
Dec 06 08:13:56 crc kubenswrapper[4954]: E1206 08:13:56.423698 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="extract-content"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.423706 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="extract-content"
Dec 06 08:13:56 crc kubenswrapper[4954]: E1206 08:13:56.423718 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="registry-server"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.423726 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="registry-server"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.423934 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57399ed-c982-4211-87dd-125d607e881a" containerName="registry-server"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.425258 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.455117 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"]
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.591854 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.591944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.592264 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89t85\" (UniqueName: \"kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.694104 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89t85\" (UniqueName: \"kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.694587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.694631 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.695358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.695574 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.722171 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89t85\" (UniqueName: \"kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85\") pod \"redhat-marketplace-6swxb\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:56 crc kubenswrapper[4954]: I1206 08:13:56.792058 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6swxb"
Dec 06 08:13:57 crc kubenswrapper[4954]: I1206 08:13:57.256296 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"]
Dec 06 08:13:57 crc kubenswrapper[4954]: I1206 08:13:57.683281 4954 generic.go:334] "Generic (PLEG): container finished" podID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerID="4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0" exitCode=0
Dec 06 08:13:57 crc kubenswrapper[4954]: I1206 08:13:57.683331 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerDied","Data":"4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0"}
Dec 06 08:13:57 crc kubenswrapper[4954]: I1206 08:13:57.683364 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerStarted","Data":"fae9807149a532d975895c389bd4c471989a9df7231a597a3aa1d3279c2a51d8"}
Dec 06 08:13:57 crc kubenswrapper[4954]: I1206 08:13:57.686633 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.216348 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"]
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.218179 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.230525 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"]
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.334902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw89c\" (UniqueName: \"kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.335058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.335408 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.436676 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.436743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw89c\" (UniqueName: \"kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.436833 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.437254 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.437275 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.462933 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw89c\" (UniqueName: \"kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c\") pod \"redhat-operators-vm7hp\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.543161 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vm7hp"
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.697785 4954 generic.go:334] "Generic (PLEG): container finished" podID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerID="ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2" exitCode=0
Dec 06 08:13:58 crc kubenswrapper[4954]: I1206 08:13:58.698236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerDied","Data":"ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2"}
Dec 06 08:13:59 crc kubenswrapper[4954]: I1206 08:13:59.043042 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"]
Dec 06 08:13:59 crc kubenswrapper[4954]: W1206 08:13:59.052488 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b2351ca_e87b_4176_8546_231e57f3a405.slice/crio-df4a831754716e7479ba459579d345f0d8e61a8a3136b9d68e4e5c8f8bd801a3 WatchSource:0}: Error finding container df4a831754716e7479ba459579d345f0d8e61a8a3136b9d68e4e5c8f8bd801a3: Status 404 returned error can't find the container with id df4a831754716e7479ba459579d345f0d8e61a8a3136b9d68e4e5c8f8bd801a3
Dec 06 08:13:59 crc kubenswrapper[4954]: I1206 08:13:59.706194 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b2351ca-e87b-4176-8546-231e57f3a405" containerID="36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971" exitCode=0
Dec 06 08:13:59 crc kubenswrapper[4954]: I1206 08:13:59.706289 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerDied","Data":"36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971"}
Dec 06 08:13:59 crc kubenswrapper[4954]: I1206 08:13:59.706386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerStarted","Data":"df4a831754716e7479ba459579d345f0d8e61a8a3136b9d68e4e5c8f8bd801a3"}
Dec 06 08:13:59 crc kubenswrapper[4954]: I1206 08:13:59.710320 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerStarted","Data":"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2"}
Dec 06 08:14:01 crc kubenswrapper[4954]: I1206 08:14:01.727614 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b2351ca-e87b-4176-8546-231e57f3a405" containerID="75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4" exitCode=0
Dec 06 08:14:01 crc kubenswrapper[4954]: I1206 08:14:01.727672 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerDied","Data":"75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4"}
Dec 06 08:14:01 crc
kubenswrapper[4954]: I1206 08:14:01.748095 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6swxb" podStartSLOduration=4.325349266 podStartE2EDuration="5.748056495s" podCreationTimestamp="2025-12-06 08:13:56 +0000 UTC" firstStartedPulling="2025-12-06 08:13:57.686359685 +0000 UTC m=+4612.499719074" lastFinishedPulling="2025-12-06 08:13:59.109066904 +0000 UTC m=+4613.922426303" observedRunningTime="2025-12-06 08:13:59.770987666 +0000 UTC m=+4614.584347055" watchObservedRunningTime="2025-12-06 08:14:01.748056495 +0000 UTC m=+4616.561415894" Dec 06 08:14:02 crc kubenswrapper[4954]: I1206 08:14:02.737445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerStarted","Data":"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3"} Dec 06 08:14:06 crc kubenswrapper[4954]: I1206 08:14:06.792960 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:06 crc kubenswrapper[4954]: I1206 08:14:06.794452 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:06 crc kubenswrapper[4954]: I1206 08:14:06.842951 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:06 crc kubenswrapper[4954]: I1206 08:14:06.875470 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vm7hp" podStartSLOduration=6.446338929 podStartE2EDuration="8.875440775s" podCreationTimestamp="2025-12-06 08:13:58 +0000 UTC" firstStartedPulling="2025-12-06 08:13:59.709837015 +0000 UTC m=+4614.523196394" lastFinishedPulling="2025-12-06 08:14:02.138938861 +0000 UTC m=+4616.952298240" observedRunningTime="2025-12-06 08:14:02.755485991 +0000 UTC m=+4617.568845400" watchObservedRunningTime="2025-12-06 08:14:06.875440775 +0000 UTC m=+4621.688800164" Dec 06 08:14:07 crc kubenswrapper[4954]: I1206 08:14:07.875914 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:07 crc kubenswrapper[4954]: I1206 08:14:07.929181 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"] Dec 06 08:14:08 crc kubenswrapper[4954]: I1206 08:14:08.543773 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:08 crc kubenswrapper[4954]: I1206 08:14:08.543845 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:08 crc kubenswrapper[4954]: I1206 08:14:08.594637 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:08 crc kubenswrapper[4954]: I1206 08:14:08.883664 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:09 crc kubenswrapper[4954]: I1206 08:14:09.514281 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"] Dec 06 08:14:09 crc kubenswrapper[4954]: I1206 08:14:09.841872 4954 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-6swxb" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="registry-server" containerID="cri-o://9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2" gracePeriod=2 Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.749652 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.851888 4954 generic.go:334] "Generic (PLEG): container finished" podID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerID="9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2" exitCode=0 Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.851943 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerDied","Data":"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2"} Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.851986 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6swxb" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.852005 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6swxb" event={"ID":"04425154-6841-41a7-8c39-4bc7309ad8ec","Type":"ContainerDied","Data":"fae9807149a532d975895c389bd4c471989a9df7231a597a3aa1d3279c2a51d8"} Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.852032 4954 scope.go:117] "RemoveContainer" containerID="9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.852149 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vm7hp" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="registry-server" containerID="cri-o://8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3" gracePeriod=2 Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.871009 4954 scope.go:117] "RemoveContainer" containerID="ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.890320 4954 scope.go:117] "RemoveContainer" containerID="4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.927204 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities\") pod \"04425154-6841-41a7-8c39-4bc7309ad8ec\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.927302 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89t85\" (UniqueName: \"kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85\") pod \"04425154-6841-41a7-8c39-4bc7309ad8ec\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.927370 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content\") pod \"04425154-6841-41a7-8c39-4bc7309ad8ec\" (UID: \"04425154-6841-41a7-8c39-4bc7309ad8ec\") " Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 
08:14:10.928488 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities" (OuterVolumeSpecName: "utilities") pod "04425154-6841-41a7-8c39-4bc7309ad8ec" (UID: "04425154-6841-41a7-8c39-4bc7309ad8ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.935208 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85" (OuterVolumeSpecName: "kube-api-access-89t85") pod "04425154-6841-41a7-8c39-4bc7309ad8ec" (UID: "04425154-6841-41a7-8c39-4bc7309ad8ec"). InnerVolumeSpecName "kube-api-access-89t85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:14:10 crc kubenswrapper[4954]: I1206 08:14:10.951415 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04425154-6841-41a7-8c39-4bc7309ad8ec" (UID: "04425154-6841-41a7-8c39-4bc7309ad8ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.024963 4954 scope.go:117] "RemoveContainer" containerID="9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2" Dec 06 08:14:11 crc kubenswrapper[4954]: E1206 08:14:11.025880 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2\": container with ID starting with 9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2 not found: ID does not exist" containerID="9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.025931 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2"} err="failed to get container status \"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2\": rpc error: code = NotFound desc = could not find container \"9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2\": container with ID starting with 9d76cc67826a994a14e00eb8cfb208e506604983b7f6b886df61887b61a7e8e2 not found: ID does not exist" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.025961 4954 scope.go:117] "RemoveContainer" containerID="ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2" Dec 06 08:14:11 crc kubenswrapper[4954]: E1206 08:14:11.027779 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2\": container with ID starting with ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2 not found: ID does not exist" containerID="ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.027926 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2"} err="failed to get container status \"ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2\": rpc error: code = NotFound desc = could not find 
container \"ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2\": container with ID starting with ea01132d582737bdf4c99e750f28ed6679d26d833c15d2f5c26afd96d47acdf2 not found: ID does not exist" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.027970 4954 scope.go:117] "RemoveContainer" containerID="4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0" Dec 06 08:14:11 crc kubenswrapper[4954]: E1206 08:14:11.028421 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0\": container with ID starting with 4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0 not found: ID does not exist" containerID="4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.028448 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0"} err="failed to get container status \"4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0\": rpc error: code = NotFound desc = could not find container \"4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0\": container with ID starting with 4ed3714eded4937cbc9304b045eefe88aa6c9e7afdcec982c965b80ed4cbd4f0 not found: ID does not exist" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.029360 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.029387 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89t85\" (UniqueName: \"kubernetes.io/projected/04425154-6841-41a7-8c39-4bc7309ad8ec-kube-api-access-89t85\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.029403 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04425154-6841-41a7-8c39-4bc7309ad8ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.189910 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"] Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.195841 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6swxb"] Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.261775 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.435168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities\") pod \"9b2351ca-e87b-4176-8546-231e57f3a405\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.435331 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content\") pod \"9b2351ca-e87b-4176-8546-231e57f3a405\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.435377 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw89c\" (UniqueName: \"kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c\") pod \"9b2351ca-e87b-4176-8546-231e57f3a405\" (UID: \"9b2351ca-e87b-4176-8546-231e57f3a405\") " Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.436776 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities" (OuterVolumeSpecName: "utilities") pod "9b2351ca-e87b-4176-8546-231e57f3a405" (UID: "9b2351ca-e87b-4176-8546-231e57f3a405"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.441413 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c" (OuterVolumeSpecName: "kube-api-access-dw89c") pod "9b2351ca-e87b-4176-8546-231e57f3a405" (UID: "9b2351ca-e87b-4176-8546-231e57f3a405"). InnerVolumeSpecName "kube-api-access-dw89c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.454161 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" path="/var/lib/kubelet/pods/04425154-6841-41a7-8c39-4bc7309ad8ec/volumes" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.538034 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.538083 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw89c\" (UniqueName: \"kubernetes.io/projected/9b2351ca-e87b-4176-8546-231e57f3a405-kube-api-access-dw89c\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.862622 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b2351ca-e87b-4176-8546-231e57f3a405" containerID="8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3" exitCode=0 Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.862724 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerDied","Data":"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3"} Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.862794 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vm7hp" event={"ID":"9b2351ca-e87b-4176-8546-231e57f3a405","Type":"ContainerDied","Data":"df4a831754716e7479ba459579d345f0d8e61a8a3136b9d68e4e5c8f8bd801a3"} Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.862822 4954 scope.go:117] "RemoveContainer" containerID="8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.862818 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vm7hp" Dec 06 08:14:11 crc kubenswrapper[4954]: I1206 08:14:11.914679 4954 scope.go:117] "RemoveContainer" containerID="75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.001028 4954 scope.go:117] "RemoveContainer" containerID="36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.023845 4954 scope.go:117] "RemoveContainer" containerID="8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3" Dec 06 08:14:12 crc kubenswrapper[4954]: E1206 08:14:12.024311 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3\": container with ID starting with 8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3 not found: ID does not exist" containerID="8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.024359 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3"} err="failed to get container status \"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3\": rpc error: code = NotFound desc = could not find container \"8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3\": container with ID starting with 8a2cb41e1bd770d24f4711030443e0eea72844479bb85409afb7554521edc7f3 not found: ID does not exist" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.024394 4954 scope.go:117] "RemoveContainer" containerID="75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4" Dec 06 08:14:12 crc kubenswrapper[4954]: E1206 08:14:12.024946 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4\": container with ID starting with 75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4 not found: ID does not exist" containerID="75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.024982 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4"} err="failed to get container status \"75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4\": rpc error: code = NotFound desc = could not find container \"75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4\": container with ID starting with 75504273e83747a5d3c3773c28a98bc30bf222b8c85ecd9b91b5f077a73949c4 not found: ID does not exist" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.025011 4954 scope.go:117] "RemoveContainer" containerID="36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971" Dec 06 08:14:12 crc kubenswrapper[4954]: E1206 08:14:12.025354 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971\": container with ID starting with 36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971 not found: ID does not exist" containerID="36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971" 
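[Editor's note] The NotFound errors above and below are the benign tail of container cleanup: by the time the kubelet re-queries ContainerStatus for an ID it has already asked CRI-O to delete, the runtime has forgotten the container, so "DeleteContainer returned error ... ID does not exist" is expected rather than a fault. A minimal sketch of how one might confirm that pattern when triaging an exported journal like this one — the file name kubelet.log and the helper name summarize are illustrative, and the regexes assume only the klog line shapes visible above:

import re
from collections import defaultdict

# Matches kubelet removal requests, e.g.:
#   ... scope.go:117] "RemoveContainer" containerID="9d76cc67..."
REMOVE_RE = re.compile(r'scope\.go:\d+\] "RemoveContainer" containerID="([0-9a-f]{64})"')

# Matches the post-removal status lookups that fail with NotFound, e.g.:
#   ... "ContainerStatus from runtime service failed" err="rpc error: code = NotFound ..." containerID="9d76cc67..."
NOTFOUND_RE = re.compile(
    r'"ContainerStatus from runtime service failed" '
    r'err="rpc error: code = NotFound.*?containerID="([0-9a-f]{64})"'
)

def summarize(path="kubelet.log"):  # path is a placeholder for your journal export
    removed, notfound = set(), defaultdict(int)
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if (m := REMOVE_RE.search(line)):
                removed.add(m.group(1))
            if (m := NOTFOUND_RE.search(line)):
                notfound[m.group(1)] += 1
    for cid, count in sorted(notfound.items()):
        verdict = "benign (removal was logged first)" if cid in removed else "investigate"
        print(f"{cid[:12]}  {count} NotFound lookup(s)  {verdict}")

if __name__ == "__main__":
    summarize()

Run against this capture, every NotFound container ID (9d76cc67..., ea01132d..., 4ed3714e..., 8a2cb41e..., 75504273..., 36e888a2...) pairs with an earlier "RemoveContainer" line, i.e. all lookups raced with an already-completed deletion.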
Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.025412 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971"} err="failed to get container status \"36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971\": rpc error: code = NotFound desc = could not find container \"36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971\": container with ID starting with 36e888a2fc87337a4e2125b53e15c5731fff43dce85f303791adb8a0d0df4971 not found: ID does not exist" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.484830 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b2351ca-e87b-4176-8546-231e57f3a405" (UID: "9b2351ca-e87b-4176-8546-231e57f3a405"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.556953 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b2351ca-e87b-4176-8546-231e57f3a405-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.801342 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"] Dec 06 08:14:12 crc kubenswrapper[4954]: I1206 08:14:12.806445 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vm7hp"] Dec 06 08:14:13 crc kubenswrapper[4954]: I1206 08:14:13.451045 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" path="/var/lib/kubelet/pods/9b2351ca-e87b-4176-8546-231e57f3a405/volumes" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.176999 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c"] Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178025 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="extract-content" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178047 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="extract-content" Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178102 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="extract-utilities" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178110 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="extract-utilities" Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178122 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178128 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178139 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="extract-utilities" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178148 
4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="extract-utilities" Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178163 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="extract-content" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178172 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="extract-content" Dec 06 08:15:00 crc kubenswrapper[4954]: E1206 08:15:00.178185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178191 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178394 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b2351ca-e87b-4176-8546-231e57f3a405" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.178417 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="04425154-6841-41a7-8c39-4bc7309ad8ec" containerName="registry-server" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.179021 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.181357 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.184164 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.193146 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c"] Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.300925 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.301009 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fv8q\" (UniqueName: \"kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.301077 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.402450 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.402502 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.402555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fv8q\" (UniqueName: \"kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.403672 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.409367 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.422924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fv8q\" (UniqueName: \"kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q\") pod \"collect-profiles-29416815-lqg6c\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.504296 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:00 crc kubenswrapper[4954]: I1206 08:15:00.952210 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c"] Dec 06 08:15:01 crc kubenswrapper[4954]: I1206 08:15:01.257378 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" event={"ID":"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f","Type":"ContainerStarted","Data":"801ac756c0693be85b21854668a57a9627135699f1d8a4611326e3352422245e"} Dec 06 08:15:01 crc kubenswrapper[4954]: I1206 08:15:01.257449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" event={"ID":"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f","Type":"ContainerStarted","Data":"900f58639016c939271ae8a182cda08ea417d0fadcee458432e8c0fed8639828"} Dec 06 08:15:01 crc kubenswrapper[4954]: I1206 08:15:01.278375 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" podStartSLOduration=1.278346899 podStartE2EDuration="1.278346899s" podCreationTimestamp="2025-12-06 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:15:01.273332476 +0000 UTC m=+4676.086691865" watchObservedRunningTime="2025-12-06 08:15:01.278346899 +0000 UTC m=+4676.091706288" Dec 06 08:15:02 crc kubenswrapper[4954]: I1206 08:15:02.268921 4954 generic.go:334] "Generic (PLEG): container finished" podID="3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" containerID="801ac756c0693be85b21854668a57a9627135699f1d8a4611326e3352422245e" exitCode=0 Dec 06 08:15:02 crc kubenswrapper[4954]: I1206 08:15:02.269050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" event={"ID":"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f","Type":"ContainerDied","Data":"801ac756c0693be85b21854668a57a9627135699f1d8a4611326e3352422245e"} Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.617081 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.768335 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume\") pod \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.768450 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fv8q\" (UniqueName: \"kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q\") pod \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.768525 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume\") pod \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\" (UID: \"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f\") " Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.769039 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" (UID: "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.775834 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" (UID: "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.776098 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q" (OuterVolumeSpecName: "kube-api-access-9fv8q") pod "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" (UID: "3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f"). InnerVolumeSpecName "kube-api-access-9fv8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.870390 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.870451 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fv8q\" (UniqueName: \"kubernetes.io/projected/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-kube-api-access-9fv8q\") on node \"crc\" DevicePath \"\"" Dec 06 08:15:03 crc kubenswrapper[4954]: I1206 08:15:03.870465 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:15:04 crc kubenswrapper[4954]: I1206 08:15:04.289182 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" event={"ID":"3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f","Type":"ContainerDied","Data":"900f58639016c939271ae8a182cda08ea417d0fadcee458432e8c0fed8639828"} Dec 06 08:15:04 crc kubenswrapper[4954]: I1206 08:15:04.289238 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900f58639016c939271ae8a182cda08ea417d0fadcee458432e8c0fed8639828" Dec 06 08:15:04 crc kubenswrapper[4954]: I1206 08:15:04.289814 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c" Dec 06 08:15:04 crc kubenswrapper[4954]: I1206 08:15:04.368969 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv"] Dec 06 08:15:04 crc kubenswrapper[4954]: I1206 08:15:04.373700 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416770-zdsdv"] Dec 06 08:15:05 crc kubenswrapper[4954]: I1206 08:15:05.456129 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4735c87-b0cf-4ce4-8e19-506938b991de" path="/var/lib/kubelet/pods/f4735c87-b0cf-4ce4-8e19-506938b991de/volumes" Dec 06 08:15:21 crc kubenswrapper[4954]: I1206 08:15:21.727429 4954 scope.go:117] "RemoveContainer" containerID="91ebdc7bfa8196cde17ffe336f5470cc3a3e672f35b8ddd3b9434cc8105d4baa" Dec 06 08:15:40 crc kubenswrapper[4954]: I1206 08:15:40.102197 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:15:40 crc kubenswrapper[4954]: I1206 08:15:40.103087 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:16:10 crc kubenswrapper[4954]: I1206 08:16:10.101746 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 06 08:16:10 crc kubenswrapper[4954]: I1206 08:16:10.102508 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.558301 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:16 crc kubenswrapper[4954]: E1206 08:16:16.559348 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" containerName="collect-profiles" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.559365 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" containerName="collect-profiles" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.559512 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" containerName="collect-profiles" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.560825 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.574631 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.723057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.723230 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbbx\" (UniqueName: \"kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.723279 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.825017 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.825134 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbbx\" (UniqueName: \"kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " 
pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.825204 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.825789 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.825838 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.848452 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbbx\" (UniqueName: \"kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx\") pod \"certified-operators-47695\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:16 crc kubenswrapper[4954]: I1206 08:16:16.905726 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:17 crc kubenswrapper[4954]: I1206 08:16:17.634124 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:17 crc kubenswrapper[4954]: I1206 08:16:17.954608 4954 generic.go:334] "Generic (PLEG): container finished" podID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerID="f8fb8adbe3247c19cbdfe09bdf12c265a5f917399542f9891b6b9f60db693e4c" exitCode=0 Dec 06 08:16:17 crc kubenswrapper[4954]: I1206 08:16:17.955064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerDied","Data":"f8fb8adbe3247c19cbdfe09bdf12c265a5f917399542f9891b6b9f60db693e4c"} Dec 06 08:16:17 crc kubenswrapper[4954]: I1206 08:16:17.955111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerStarted","Data":"8b1dac05d2538abfabf46e4a6872c760ef708251ba6482a0d5ce522eb867ef13"} Dec 06 08:16:18 crc kubenswrapper[4954]: I1206 08:16:18.972019 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerStarted","Data":"1a376ec05a59480234e3141ded0e89ad93e5aa301ec83d187d6ca7a5dd124099"} Dec 06 08:16:19 crc kubenswrapper[4954]: I1206 08:16:19.980239 4954 generic.go:334] "Generic (PLEG): container finished" podID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerID="1a376ec05a59480234e3141ded0e89ad93e5aa301ec83d187d6ca7a5dd124099" exitCode=0 Dec 06 08:16:19 crc kubenswrapper[4954]: I1206 08:16:19.980315 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerDied","Data":"1a376ec05a59480234e3141ded0e89ad93e5aa301ec83d187d6ca7a5dd124099"} Dec 06 08:16:20 crc kubenswrapper[4954]: I1206 08:16:20.992272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerStarted","Data":"03305f6f8e3ed0c8deaf011abf1b5c0ba58a12a5b1440c27adbff100d4e8d856"} Dec 06 08:16:21 crc kubenswrapper[4954]: I1206 08:16:21.014689 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47695" podStartSLOduration=2.609180163 podStartE2EDuration="5.014664542s" podCreationTimestamp="2025-12-06 08:16:16 +0000 UTC" firstStartedPulling="2025-12-06 08:16:17.956634349 +0000 UTC m=+4752.769993728" lastFinishedPulling="2025-12-06 08:16:20.362118718 +0000 UTC m=+4755.175478107" observedRunningTime="2025-12-06 08:16:21.011088398 +0000 UTC m=+4755.824447807" watchObservedRunningTime="2025-12-06 08:16:21.014664542 +0000 UTC m=+4755.828023931" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.538677 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.540968 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.545127 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.589221 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.589426 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.691458 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pm4\" (UniqueName: \"kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.691604 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.691696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.692391 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.692493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.793167 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pm4\" (UniqueName: \"kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.822446 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pm4\" (UniqueName: \"kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4\") pod \"community-operators-h6vrz\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:25 crc kubenswrapper[4954]: I1206 08:16:25.880267 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:26 crc kubenswrapper[4954]: I1206 08:16:26.210295 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:26 crc kubenswrapper[4954]: I1206 08:16:26.906727 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:26 crc kubenswrapper[4954]: I1206 08:16:26.907287 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:26 crc kubenswrapper[4954]: I1206 08:16:26.959293 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:27 crc kubenswrapper[4954]: I1206 08:16:27.075937 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerID="3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30" exitCode=0 Dec 06 08:16:27 crc kubenswrapper[4954]: I1206 08:16:27.076367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerDied","Data":"3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30"} Dec 06 08:16:27 crc kubenswrapper[4954]: I1206 08:16:27.076515 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerStarted","Data":"bfee2d196091e2390f6162200bb6259f6ee0044696d28391ff7c25ed0cbcbde9"} Dec 06 08:16:27 crc kubenswrapper[4954]: I1206 08:16:27.130687 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:29 crc kubenswrapper[4954]: I1206 08:16:29.096759 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerID="cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b" exitCode=0 Dec 06 08:16:29 crc kubenswrapper[4954]: I1206 08:16:29.096851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerDied","Data":"cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b"} Dec 06 08:16:29 crc kubenswrapper[4954]: I1206 08:16:29.298116 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:29 crc kubenswrapper[4954]: I1206 08:16:29.298688 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-47695" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="registry-server" containerID="cri-o://03305f6f8e3ed0c8deaf011abf1b5c0ba58a12a5b1440c27adbff100d4e8d856" gracePeriod=2 Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.109039 4954 generic.go:334] "Generic (PLEG): container finished" podID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerID="03305f6f8e3ed0c8deaf011abf1b5c0ba58a12a5b1440c27adbff100d4e8d856" exitCode=0 Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.109142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" 
event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerDied","Data":"03305f6f8e3ed0c8deaf011abf1b5c0ba58a12a5b1440c27adbff100d4e8d856"} Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.243375 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.291044 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lbbx\" (UniqueName: \"kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx\") pod \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.291203 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities\") pod \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.291323 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content\") pod \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\" (UID: \"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b\") " Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.294236 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities" (OuterVolumeSpecName: "utilities") pod "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" (UID: "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.323154 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx" (OuterVolumeSpecName: "kube-api-access-4lbbx") pod "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" (UID: "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b"). InnerVolumeSpecName "kube-api-access-4lbbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.358068 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" (UID: "3bbaf2ea-d0a3-4885-8b30-a979bc9c878b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.394461 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.394544 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:30 crc kubenswrapper[4954]: I1206 08:16:30.394602 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lbbx\" (UniqueName: \"kubernetes.io/projected/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b-kube-api-access-4lbbx\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.119332 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47695" event={"ID":"3bbaf2ea-d0a3-4885-8b30-a979bc9c878b","Type":"ContainerDied","Data":"8b1dac05d2538abfabf46e4a6872c760ef708251ba6482a0d5ce522eb867ef13"} Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.119789 4954 scope.go:117] "RemoveContainer" containerID="03305f6f8e3ed0c8deaf011abf1b5c0ba58a12a5b1440c27adbff100d4e8d856" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.119478 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47695" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.123774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerStarted","Data":"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb"} Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.139247 4954 scope.go:117] "RemoveContainer" containerID="1a376ec05a59480234e3141ded0e89ad93e5aa301ec83d187d6ca7a5dd124099" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.150683 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6vrz" podStartSLOduration=3.490209662 podStartE2EDuration="6.150661073s" podCreationTimestamp="2025-12-06 08:16:25 +0000 UTC" firstStartedPulling="2025-12-06 08:16:27.077457398 +0000 UTC m=+4761.890816787" lastFinishedPulling="2025-12-06 08:16:29.737908819 +0000 UTC m=+4764.551268198" observedRunningTime="2025-12-06 08:16:31.146117012 +0000 UTC m=+4765.959476401" watchObservedRunningTime="2025-12-06 08:16:31.150661073 +0000 UTC m=+4765.964020462" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.171435 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.173013 4954 scope.go:117] "RemoveContainer" containerID="f8fb8adbe3247c19cbdfe09bdf12c265a5f917399542f9891b6b9f60db693e4c" Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.177405 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-47695"] Dec 06 08:16:31 crc kubenswrapper[4954]: I1206 08:16:31.453211 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" path="/var/lib/kubelet/pods/3bbaf2ea-d0a3-4885-8b30-a979bc9c878b/volumes" Dec 06 08:16:35 crc kubenswrapper[4954]: I1206 08:16:35.880554 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:35 crc kubenswrapper[4954]: I1206 08:16:35.880649 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:35 crc kubenswrapper[4954]: I1206 08:16:35.951870 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:36 crc kubenswrapper[4954]: I1206 08:16:36.222622 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:36 crc kubenswrapper[4954]: I1206 08:16:36.288195 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:38 crc kubenswrapper[4954]: I1206 08:16:38.189851 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6vrz" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="registry-server" containerID="cri-o://88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb" gracePeriod=2 Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.776869 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.846704 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities\") pod \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.846759 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5pm4\" (UniqueName: \"kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4\") pod \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.846836 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content\") pod \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\" (UID: \"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5\") " Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.847881 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities" (OuterVolumeSpecName: "utilities") pod "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" (UID: "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.853775 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4" (OuterVolumeSpecName: "kube-api-access-d5pm4") pod "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" (UID: "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5"). InnerVolumeSpecName "kube-api-access-d5pm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.916430 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" (UID: "3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.948620 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.948671 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:39 crc kubenswrapper[4954]: I1206 08:16:39.948685 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5pm4\" (UniqueName: \"kubernetes.io/projected/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5-kube-api-access-d5pm4\") on node \"crc\" DevicePath \"\"" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.101504 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.101591 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.101658 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.102312 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.102378 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0" gracePeriod=600 Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.209413 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerID="88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb" exitCode=0 Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.209470 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" 
event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerDied","Data":"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb"} Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.209476 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6vrz" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.209501 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6vrz" event={"ID":"3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5","Type":"ContainerDied","Data":"bfee2d196091e2390f6162200bb6259f6ee0044696d28391ff7c25ed0cbcbde9"} Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.209553 4954 scope.go:117] "RemoveContainer" containerID="88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.286745 4954 scope.go:117] "RemoveContainer" containerID="cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.289033 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.302782 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6vrz"] Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.339352 4954 scope.go:117] "RemoveContainer" containerID="3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.366076 4954 scope.go:117] "RemoveContainer" containerID="88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb" Dec 06 08:16:40 crc kubenswrapper[4954]: E1206 08:16:40.366930 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb\": container with ID starting with 88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb not found: ID does not exist" containerID="88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.366988 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb"} err="failed to get container status \"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb\": rpc error: code = NotFound desc = could not find container \"88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb\": container with ID starting with 88b555a60d49a8957ff6415c80bce55a61566a3d49c1656cc7209a90042390eb not found: ID does not exist" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.367029 4954 scope.go:117] "RemoveContainer" containerID="cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b" Dec 06 08:16:40 crc kubenswrapper[4954]: E1206 08:16:40.367812 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b\": container with ID starting with cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b not found: ID does not exist" containerID="cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.367859 4954 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b"} err="failed to get container status \"cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b\": rpc error: code = NotFound desc = could not find container \"cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b\": container with ID starting with cc64690c333d02ab5d3c0990ba1a5b3a9afbfd8374a7c8ab70a4c83605ef046b not found: ID does not exist" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.367885 4954 scope.go:117] "RemoveContainer" containerID="3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30" Dec 06 08:16:40 crc kubenswrapper[4954]: E1206 08:16:40.368254 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30\": container with ID starting with 3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30 not found: ID does not exist" containerID="3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30" Dec 06 08:16:40 crc kubenswrapper[4954]: I1206 08:16:40.368280 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30"} err="failed to get container status \"3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30\": rpc error: code = NotFound desc = could not find container \"3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30\": container with ID starting with 3a1f885e2f77dc8bbeea3a800ea77e0d889216d3df0ca16c45f390322736ba30 not found: ID does not exist" Dec 06 08:16:41 crc kubenswrapper[4954]: I1206 08:16:41.219624 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0" exitCode=0 Dec 06 08:16:41 crc kubenswrapper[4954]: I1206 08:16:41.219708 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0"} Dec 06 08:16:41 crc kubenswrapper[4954]: I1206 08:16:41.220063 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84"} Dec 06 08:16:41 crc kubenswrapper[4954]: I1206 08:16:41.220091 4954 scope.go:117] "RemoveContainer" containerID="9be9ae2c9bea80996bdb04ae109e8553c16fcc872e7fca3bba95f7a363911316" Dec 06 08:16:41 crc kubenswrapper[4954]: I1206 08:16:41.454506 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" path="/var/lib/kubelet/pods/3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5/volumes" Dec 06 08:18:40 crc kubenswrapper[4954]: I1206 08:18:40.101836 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:18:40 crc kubenswrapper[4954]: I1206 08:18:40.102875 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:19:10 crc kubenswrapper[4954]: I1206 08:19:10.101362 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:19:10 crc kubenswrapper[4954]: I1206 08:19:10.102473 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.101390 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.102856 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.103124 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.104311 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.104413 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" gracePeriod=600 Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.776398 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" exitCode=0 Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.776464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84"} Dec 06 08:19:40 crc kubenswrapper[4954]: I1206 08:19:40.777318 4954 scope.go:117] "RemoveContainer" 
containerID="4766bf634a6ae62ad652b30c578e6fcff00da433de1b1fcd0bfc25ade2fefed0" Dec 06 08:19:41 crc kubenswrapper[4954]: E1206 08:19:41.170518 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:19:41 crc kubenswrapper[4954]: I1206 08:19:41.788537 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:19:41 crc kubenswrapper[4954]: E1206 08:19:41.789433 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:19:53 crc kubenswrapper[4954]: I1206 08:19:53.444868 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:19:53 crc kubenswrapper[4954]: E1206 08:19:53.446241 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:20:04 crc kubenswrapper[4954]: I1206 08:20:04.443467 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:20:04 crc kubenswrapper[4954]: E1206 08:20:04.445755 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:20:15 crc kubenswrapper[4954]: I1206 08:20:15.448300 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:20:15 crc kubenswrapper[4954]: E1206 08:20:15.449010 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:20:30 crc kubenswrapper[4954]: I1206 08:20:30.444286 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:20:30 crc kubenswrapper[4954]: E1206 08:20:30.445413 4954 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:20:43 crc kubenswrapper[4954]: I1206 08:20:43.444623 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:20:43 crc kubenswrapper[4954]: E1206 08:20:43.446933 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:20:54 crc kubenswrapper[4954]: I1206 08:20:54.443903 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:20:54 crc kubenswrapper[4954]: E1206 08:20:54.444830 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:21:07 crc kubenswrapper[4954]: I1206 08:21:07.443591 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:21:07 crc kubenswrapper[4954]: E1206 08:21:07.444379 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:21:18 crc kubenswrapper[4954]: I1206 08:21:18.443704 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:21:18 crc kubenswrapper[4954]: E1206 08:21:18.444432 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:21:29 crc kubenswrapper[4954]: I1206 08:21:29.444100 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:21:29 crc kubenswrapper[4954]: E1206 08:21:29.444898 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:21:42 crc kubenswrapper[4954]: I1206 08:21:42.444155 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:21:42 crc kubenswrapper[4954]: E1206 08:21:42.444945 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:21:57 crc kubenswrapper[4954]: I1206 08:21:57.443019 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:21:57 crc kubenswrapper[4954]: E1206 08:21:57.443864 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:22:10 crc kubenswrapper[4954]: I1206 08:22:10.444374 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:22:10 crc kubenswrapper[4954]: E1206 08:22:10.445898 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:22:25 crc kubenswrapper[4954]: I1206 08:22:25.452189 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:22:25 crc kubenswrapper[4954]: E1206 08:22:25.456635 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:22:38 crc kubenswrapper[4954]: I1206 08:22:38.443402 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:22:38 crc kubenswrapper[4954]: E1206 08:22:38.444406 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:22:50 crc kubenswrapper[4954]: I1206 08:22:50.443492 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:22:50 crc kubenswrapper[4954]: E1206 08:22:50.444248 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:23:01 crc kubenswrapper[4954]: I1206 08:23:01.443345 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:23:01 crc kubenswrapper[4954]: E1206 08:23:01.444263 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:23:14 crc kubenswrapper[4954]: I1206 08:23:14.443421 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:23:14 crc kubenswrapper[4954]: E1206 08:23:14.444386 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:23:25 crc kubenswrapper[4954]: I1206 08:23:25.446719 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:23:25 crc kubenswrapper[4954]: E1206 08:23:25.447782 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:23:37 crc kubenswrapper[4954]: I1206 08:23:37.443129 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:23:37 crc kubenswrapper[4954]: E1206 08:23:37.443946 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:23:49 crc kubenswrapper[4954]: I1206 08:23:49.443033 4954 scope.go:117] "RemoveContainer" 
containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:23:49 crc kubenswrapper[4954]: E1206 08:23:49.443926 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:24:01 crc kubenswrapper[4954]: I1206 08:24:01.443749 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:24:01 crc kubenswrapper[4954]: E1206 08:24:01.444433 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:24:16 crc kubenswrapper[4954]: I1206 08:24:16.442967 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:24:16 crc kubenswrapper[4954]: E1206 08:24:16.444033 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:24:30 crc kubenswrapper[4954]: I1206 08:24:30.443618 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:24:30 crc kubenswrapper[4954]: E1206 08:24:30.445818 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:24:41 crc kubenswrapper[4954]: I1206 08:24:41.443389 4954 scope.go:117] "RemoveContainer" containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:24:42 crc kubenswrapper[4954]: I1206 08:24:42.187345 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2"} Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.465919 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467011 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="extract-content" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467024 4954 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="extract-content" Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467047 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="extract-utilities" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467054 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="extract-utilities" Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467068 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="extract-utilities" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467074 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="extract-utilities" Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467096 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467101 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467113 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="extract-content" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467119 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="extract-content" Dec 06 08:24:58 crc kubenswrapper[4954]: E1206 08:24:58.467127 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467132 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467307 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5bfd1d-861a-4d8d-a845-0b6e136aaaa5" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.467326 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bbaf2ea-d0a3-4885-8b30-a979bc9c878b" containerName="registry-server" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.468373 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.491167 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.530895 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmk9\" (UniqueName: \"kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.530966 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.531031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.632358 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.632472 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.632576 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckmk9\" (UniqueName: \"kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.633170 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.633471 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.655016 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ckmk9\" (UniqueName: \"kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9\") pod \"redhat-marketplace-nbpsn\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:58 crc kubenswrapper[4954]: I1206 08:24:58.798402 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:24:59 crc kubenswrapper[4954]: I1206 08:24:59.075058 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:24:59 crc kubenswrapper[4954]: I1206 08:24:59.347110 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6256427-9e62-4805-b37f-a4148c7c261d" containerID="5008a1c387a7a674beab3091f386e7386296a37886695580b183905fcd7278e7" exitCode=0 Dec 06 08:24:59 crc kubenswrapper[4954]: I1206 08:24:59.347158 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerDied","Data":"5008a1c387a7a674beab3091f386e7386296a37886695580b183905fcd7278e7"} Dec 06 08:24:59 crc kubenswrapper[4954]: I1206 08:24:59.348001 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerStarted","Data":"fd138e6cc6accceb65b356a0a6622a7c5a7c93b763e94f26f49e4270115d7ed2"} Dec 06 08:24:59 crc kubenswrapper[4954]: I1206 08:24:59.350718 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 08:25:00 crc kubenswrapper[4954]: I1206 08:25:00.358649 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6256427-9e62-4805-b37f-a4148c7c261d" containerID="5dd47f43cff83f52b8dfcd7ea03bb7459cf998d91cd6e9ccfc1caa613e61ac4b" exitCode=0 Dec 06 08:25:00 crc kubenswrapper[4954]: I1206 08:25:00.358730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerDied","Data":"5dd47f43cff83f52b8dfcd7ea03bb7459cf998d91cd6e9ccfc1caa613e61ac4b"} Dec 06 08:25:01 crc kubenswrapper[4954]: I1206 08:25:01.389778 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerStarted","Data":"df110b1938395cf7b49e5c665c9407b6b52744156b49097f3ea12a0bf7f5ceff"} Dec 06 08:25:01 crc kubenswrapper[4954]: I1206 08:25:01.414730 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nbpsn" podStartSLOduration=1.886957052 podStartE2EDuration="3.414639044s" podCreationTimestamp="2025-12-06 08:24:58 +0000 UTC" firstStartedPulling="2025-12-06 08:24:59.350308271 +0000 UTC m=+5274.163667660" lastFinishedPulling="2025-12-06 08:25:00.877990263 +0000 UTC m=+5275.691349652" observedRunningTime="2025-12-06 08:25:01.414019648 +0000 UTC m=+5276.227379057" watchObservedRunningTime="2025-12-06 08:25:01.414639044 +0000 UTC m=+5276.227998433" Dec 06 08:25:08 crc kubenswrapper[4954]: I1206 08:25:08.799283 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:08 crc kubenswrapper[4954]: I1206 08:25:08.800397 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:08 crc kubenswrapper[4954]: I1206 08:25:08.850139 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:09 crc kubenswrapper[4954]: I1206 08:25:09.524365 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:09 crc kubenswrapper[4954]: I1206 08:25:09.589880 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:25:11 crc kubenswrapper[4954]: I1206 08:25:11.491756 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nbpsn" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="registry-server" containerID="cri-o://df110b1938395cf7b49e5c665c9407b6b52744156b49097f3ea12a0bf7f5ceff" gracePeriod=2 Dec 06 08:25:13 crc kubenswrapper[4954]: I1206 08:25:13.516704 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6256427-9e62-4805-b37f-a4148c7c261d" containerID="df110b1938395cf7b49e5c665c9407b6b52744156b49097f3ea12a0bf7f5ceff" exitCode=0 Dec 06 08:25:13 crc kubenswrapper[4954]: I1206 08:25:13.516803 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerDied","Data":"df110b1938395cf7b49e5c665c9407b6b52744156b49097f3ea12a0bf7f5ceff"} Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.107915 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.214619 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckmk9\" (UniqueName: \"kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9\") pod \"f6256427-9e62-4805-b37f-a4148c7c261d\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.214766 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content\") pod \"f6256427-9e62-4805-b37f-a4148c7c261d\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.214808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities\") pod \"f6256427-9e62-4805-b37f-a4148c7c261d\" (UID: \"f6256427-9e62-4805-b37f-a4148c7c261d\") " Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.216030 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities" (OuterVolumeSpecName: "utilities") pod "f6256427-9e62-4805-b37f-a4148c7c261d" (UID: "f6256427-9e62-4805-b37f-a4148c7c261d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.221519 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9" (OuterVolumeSpecName: "kube-api-access-ckmk9") pod "f6256427-9e62-4805-b37f-a4148c7c261d" (UID: "f6256427-9e62-4805-b37f-a4148c7c261d"). InnerVolumeSpecName "kube-api-access-ckmk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.235737 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6256427-9e62-4805-b37f-a4148c7c261d" (UID: "f6256427-9e62-4805-b37f-a4148c7c261d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.316691 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckmk9\" (UniqueName: \"kubernetes.io/projected/f6256427-9e62-4805-b37f-a4148c7c261d-kube-api-access-ckmk9\") on node \"crc\" DevicePath \"\"" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.316927 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.316940 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6256427-9e62-4805-b37f-a4148c7c261d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.529538 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbpsn" event={"ID":"f6256427-9e62-4805-b37f-a4148c7c261d","Type":"ContainerDied","Data":"fd138e6cc6accceb65b356a0a6622a7c5a7c93b763e94f26f49e4270115d7ed2"} Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.529613 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbpsn" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.529618 4954 scope.go:117] "RemoveContainer" containerID="df110b1938395cf7b49e5c665c9407b6b52744156b49097f3ea12a0bf7f5ceff" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.553995 4954 scope.go:117] "RemoveContainer" containerID="5dd47f43cff83f52b8dfcd7ea03bb7459cf998d91cd6e9ccfc1caa613e61ac4b" Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.571429 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.576829 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbpsn"] Dec 06 08:25:14 crc kubenswrapper[4954]: I1206 08:25:14.595501 4954 scope.go:117] "RemoveContainer" containerID="5008a1c387a7a674beab3091f386e7386296a37886695580b183905fcd7278e7" Dec 06 08:25:15 crc kubenswrapper[4954]: I1206 08:25:15.454358 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" path="/var/lib/kubelet/pods/f6256427-9e62-4805-b37f-a4148c7c261d/volumes" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.504194 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:19 crc kubenswrapper[4954]: E1206 08:26:19.505264 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="extract-content" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.505290 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="extract-content" Dec 06 08:26:19 crc kubenswrapper[4954]: E1206 08:26:19.505790 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="extract-utilities" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.505801 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="extract-utilities" Dec 06 08:26:19 crc kubenswrapper[4954]: E1206 08:26:19.505816 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="registry-server" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.505822 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="registry-server" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.505983 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6256427-9e62-4805-b37f-a4148c7c261d" containerName="registry-server" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.507310 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.524008 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.572233 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.572280 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtj7s\" (UniqueName: \"kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.572308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.673777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.673835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtj7s\" (UniqueName: \"kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.673869 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.674335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.674424 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.693582 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mtj7s\" (UniqueName: \"kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s\") pod \"certified-operators-8f7dk\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:19 crc kubenswrapper[4954]: I1206 08:26:19.841410 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:20 crc kubenswrapper[4954]: I1206 08:26:20.340985 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:21 crc kubenswrapper[4954]: I1206 08:26:21.110086 4954 generic.go:334] "Generic (PLEG): container finished" podID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerID="62916e939abfd47c1283318a5574c7980d6214f41dd7348a92874150b7714c43" exitCode=0 Dec 06 08:26:21 crc kubenswrapper[4954]: I1206 08:26:21.110179 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerDied","Data":"62916e939abfd47c1283318a5574c7980d6214f41dd7348a92874150b7714c43"} Dec 06 08:26:21 crc kubenswrapper[4954]: I1206 08:26:21.110518 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerStarted","Data":"124aef189896d8ceed9db7c33d174389a6a62f6af1be40c6f58c0be574219f76"} Dec 06 08:26:22 crc kubenswrapper[4954]: I1206 08:26:22.121652 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerStarted","Data":"e158929e66663b0e8d5b70874a3d1738fb23c2a033a751adb15802e13007874b"} Dec 06 08:26:23 crc kubenswrapper[4954]: I1206 08:26:23.130945 4954 generic.go:334] "Generic (PLEG): container finished" podID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerID="e158929e66663b0e8d5b70874a3d1738fb23c2a033a751adb15802e13007874b" exitCode=0 Dec 06 08:26:23 crc kubenswrapper[4954]: I1206 08:26:23.131401 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerDied","Data":"e158929e66663b0e8d5b70874a3d1738fb23c2a033a751adb15802e13007874b"} Dec 06 08:26:24 crc kubenswrapper[4954]: I1206 08:26:24.142864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerStarted","Data":"aff7586f666ab7208d11a995aadefc2ca4c2892388fe7edf08afeb071c934f69"} Dec 06 08:26:29 crc kubenswrapper[4954]: I1206 08:26:29.841943 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:29 crc kubenswrapper[4954]: I1206 08:26:29.842705 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:29 crc kubenswrapper[4954]: I1206 08:26:29.890419 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:29 crc kubenswrapper[4954]: I1206 08:26:29.916888 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8f7dk" podStartSLOduration=8.443528251 podStartE2EDuration="10.916823259s" podCreationTimestamp="2025-12-06 08:26:19 +0000 UTC" firstStartedPulling="2025-12-06 08:26:21.112235556 +0000 UTC m=+5355.925594985" lastFinishedPulling="2025-12-06 08:26:23.585530604 +0000 UTC m=+5358.398889993" observedRunningTime="2025-12-06 08:26:24.170112826 +0000 UTC m=+5358.983472215" watchObservedRunningTime="2025-12-06 08:26:29.916823259 +0000 UTC m=+5364.730182678" Dec 06 08:26:30 crc kubenswrapper[4954]: I1206 08:26:30.228094 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:30 crc kubenswrapper[4954]: I1206 08:26:30.269686 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:32 crc kubenswrapper[4954]: I1206 08:26:32.205265 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8f7dk" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="registry-server" containerID="cri-o://aff7586f666ab7208d11a995aadefc2ca4c2892388fe7edf08afeb071c934f69" gracePeriod=2 Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.213728 4954 generic.go:334] "Generic (PLEG): container finished" podID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerID="aff7586f666ab7208d11a995aadefc2ca4c2892388fe7edf08afeb071c934f69" exitCode=0 Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.213778 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerDied","Data":"aff7586f666ab7208d11a995aadefc2ca4c2892388fe7edf08afeb071c934f69"} Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.715220 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.825190 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtj7s\" (UniqueName: \"kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s\") pod \"fc13512f-4144-436b-b0dd-ef89280c8d80\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.825458 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities\") pod \"fc13512f-4144-436b-b0dd-ef89280c8d80\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.825494 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content\") pod \"fc13512f-4144-436b-b0dd-ef89280c8d80\" (UID: \"fc13512f-4144-436b-b0dd-ef89280c8d80\") " Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.826620 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities" (OuterVolumeSpecName: "utilities") pod "fc13512f-4144-436b-b0dd-ef89280c8d80" (UID: "fc13512f-4144-436b-b0dd-ef89280c8d80"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.833659 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s" (OuterVolumeSpecName: "kube-api-access-mtj7s") pod "fc13512f-4144-436b-b0dd-ef89280c8d80" (UID: "fc13512f-4144-436b-b0dd-ef89280c8d80"). InnerVolumeSpecName "kube-api-access-mtj7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.875846 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc13512f-4144-436b-b0dd-ef89280c8d80" (UID: "fc13512f-4144-436b-b0dd-ef89280c8d80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.927360 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.927405 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc13512f-4144-436b-b0dd-ef89280c8d80-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:26:33 crc kubenswrapper[4954]: I1206 08:26:33.927422 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtj7s\" (UniqueName: \"kubernetes.io/projected/fc13512f-4144-436b-b0dd-ef89280c8d80-kube-api-access-mtj7s\") on node \"crc\" DevicePath \"\"" Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.222895 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f7dk" event={"ID":"fc13512f-4144-436b-b0dd-ef89280c8d80","Type":"ContainerDied","Data":"124aef189896d8ceed9db7c33d174389a6a62f6af1be40c6f58c0be574219f76"} Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.222954 4954 scope.go:117] "RemoveContainer" containerID="aff7586f666ab7208d11a995aadefc2ca4c2892388fe7edf08afeb071c934f69" Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.222982 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f7dk" Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.248546 4954 scope.go:117] "RemoveContainer" containerID="e158929e66663b0e8d5b70874a3d1738fb23c2a033a751adb15802e13007874b" Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.261945 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.275328 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8f7dk"] Dec 06 08:26:34 crc kubenswrapper[4954]: I1206 08:26:34.292401 4954 scope.go:117] "RemoveContainer" containerID="62916e939abfd47c1283318a5574c7980d6214f41dd7348a92874150b7714c43" Dec 06 08:26:35 crc kubenswrapper[4954]: I1206 08:26:35.451094 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" path="/var/lib/kubelet/pods/fc13512f-4144-436b-b0dd-ef89280c8d80/volumes" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.531784 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:26:49 crc kubenswrapper[4954]: E1206 08:26:49.532829 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="registry-server" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.532846 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="registry-server" Dec 06 08:26:49 crc kubenswrapper[4954]: E1206 08:26:49.532862 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="extract-utilities" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.532871 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="extract-utilities" Dec 06 08:26:49 crc kubenswrapper[4954]: E1206 08:26:49.532893 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="extract-content" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.532900 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="extract-content" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.533094 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc13512f-4144-436b-b0dd-ef89280c8d80" containerName="registry-server" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.534332 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.540743 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.559902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.559952 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.560001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kd96\" (UniqueName: \"kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.661841 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.661909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kd96\" (UniqueName: \"kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.661983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.662411 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.662453 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.683693 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5kd96\" (UniqueName: \"kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96\") pod \"community-operators-7hcxh\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:49 crc kubenswrapper[4954]: I1206 08:26:49.859772 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:50 crc kubenswrapper[4954]: I1206 08:26:50.206876 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:26:50 crc kubenswrapper[4954]: I1206 08:26:50.339375 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerStarted","Data":"b4f1705d3ddd38eac72016c3a8303c3a3ecb5683afa35460a6056e0e21dbc930"} Dec 06 08:26:51 crc kubenswrapper[4954]: I1206 08:26:51.353289 4954 generic.go:334] "Generic (PLEG): container finished" podID="c80f646f-7307-4968-8fde-94209077053e" containerID="1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06" exitCode=0 Dec 06 08:26:51 crc kubenswrapper[4954]: I1206 08:26:51.353334 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerDied","Data":"1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06"} Dec 06 08:26:52 crc kubenswrapper[4954]: I1206 08:26:52.369508 4954 generic.go:334] "Generic (PLEG): container finished" podID="c80f646f-7307-4968-8fde-94209077053e" containerID="e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410" exitCode=0 Dec 06 08:26:52 crc kubenswrapper[4954]: I1206 08:26:52.369822 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerDied","Data":"e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410"} Dec 06 08:26:53 crc kubenswrapper[4954]: I1206 08:26:53.379531 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerStarted","Data":"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb"} Dec 06 08:26:53 crc kubenswrapper[4954]: I1206 08:26:53.403418 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hcxh" podStartSLOduration=2.926269662 podStartE2EDuration="4.403382483s" podCreationTimestamp="2025-12-06 08:26:49 +0000 UTC" firstStartedPulling="2025-12-06 08:26:51.35676145 +0000 UTC m=+5386.170120839" lastFinishedPulling="2025-12-06 08:26:52.833874271 +0000 UTC m=+5387.647233660" observedRunningTime="2025-12-06 08:26:53.395693519 +0000 UTC m=+5388.209052918" watchObservedRunningTime="2025-12-06 08:26:53.403382483 +0000 UTC m=+5388.216741872" Dec 06 08:26:59 crc kubenswrapper[4954]: I1206 08:26:59.860754 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:59 crc kubenswrapper[4954]: I1206 08:26:59.861968 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:26:59 crc kubenswrapper[4954]: I1206 08:26:59.916316 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:27:00 crc kubenswrapper[4954]: I1206 08:27:00.469609 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:27:00 crc kubenswrapper[4954]: I1206 08:27:00.531801 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:27:02 crc kubenswrapper[4954]: I1206 08:27:02.441286 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hcxh" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="registry-server" containerID="cri-o://95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb" gracePeriod=2 Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.400700 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.453442 4954 generic.go:334] "Generic (PLEG): container finished" podID="c80f646f-7307-4968-8fde-94209077053e" containerID="95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb" exitCode=0 Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.453539 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hcxh" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.455545 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerDied","Data":"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb"} Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.455614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hcxh" event={"ID":"c80f646f-7307-4968-8fde-94209077053e","Type":"ContainerDied","Data":"b4f1705d3ddd38eac72016c3a8303c3a3ecb5683afa35460a6056e0e21dbc930"} Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.455639 4954 scope.go:117] "RemoveContainer" containerID="95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.467951 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kd96\" (UniqueName: \"kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96\") pod \"c80f646f-7307-4968-8fde-94209077053e\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.468012 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content\") pod \"c80f646f-7307-4968-8fde-94209077053e\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.468169 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities\") pod \"c80f646f-7307-4968-8fde-94209077053e\" (UID: \"c80f646f-7307-4968-8fde-94209077053e\") " Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.469049 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities" (OuterVolumeSpecName: "utilities") pod "c80f646f-7307-4968-8fde-94209077053e" (UID: "c80f646f-7307-4968-8fde-94209077053e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.476194 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96" (OuterVolumeSpecName: "kube-api-access-5kd96") pod "c80f646f-7307-4968-8fde-94209077053e" (UID: "c80f646f-7307-4968-8fde-94209077053e"). InnerVolumeSpecName "kube-api-access-5kd96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.478130 4954 scope.go:117] "RemoveContainer" containerID="e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.522533 4954 scope.go:117] "RemoveContainer" containerID="1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.535097 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c80f646f-7307-4968-8fde-94209077053e" (UID: "c80f646f-7307-4968-8fde-94209077053e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.545273 4954 scope.go:117] "RemoveContainer" containerID="95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb" Dec 06 08:27:03 crc kubenswrapper[4954]: E1206 08:27:03.545784 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb\": container with ID starting with 95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb not found: ID does not exist" containerID="95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.545825 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb"} err="failed to get container status \"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb\": rpc error: code = NotFound desc = could not find container \"95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb\": container with ID starting with 95e03cb94e2c2a640baeadb08aa68ca4982373cbc9eea396b114f14676966efb not found: ID does not exist" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.545855 4954 scope.go:117] "RemoveContainer" containerID="e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410" Dec 06 08:27:03 crc kubenswrapper[4954]: E1206 08:27:03.546232 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410\": container with ID starting with e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410 not found: ID does not exist" containerID="e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.546260 4954 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410"} err="failed to get container status \"e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410\": rpc error: code = NotFound desc = could not find container \"e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410\": container with ID starting with e48971e49244ea798c8d9c70e4c09ad189463ebd23d2b52d530146ebebfb4410 not found: ID does not exist" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.546275 4954 scope.go:117] "RemoveContainer" containerID="1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06" Dec 06 08:27:03 crc kubenswrapper[4954]: E1206 08:27:03.546642 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06\": container with ID starting with 1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06 not found: ID does not exist" containerID="1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.546664 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06"} err="failed to get container status \"1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06\": rpc error: code = NotFound desc = could not find container \"1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06\": container with ID starting with 1917e742172a6666c16c6f0a5f883f7500046cec8dbd86cd0f77a92ef48ffe06 not found: ID does not exist" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.570248 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kd96\" (UniqueName: \"kubernetes.io/projected/c80f646f-7307-4968-8fde-94209077053e-kube-api-access-5kd96\") on node \"crc\" DevicePath \"\"" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.570311 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.570326 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c80f646f-7307-4968-8fde-94209077053e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.788470 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:27:03 crc kubenswrapper[4954]: I1206 08:27:03.795020 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7hcxh"] Dec 06 08:27:05 crc kubenswrapper[4954]: I1206 08:27:05.454481 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80f646f-7307-4968-8fde-94209077053e" path="/var/lib/kubelet/pods/c80f646f-7307-4968-8fde-94209077053e/volumes" Dec 06 08:27:10 crc kubenswrapper[4954]: I1206 08:27:10.101395 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:27:10 crc kubenswrapper[4954]: I1206 08:27:10.102489 4954 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:27:40 crc kubenswrapper[4954]: I1206 08:27:40.101343 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:27:40 crc kubenswrapper[4954]: I1206 08:27:40.102258 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.101359 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.101859 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.101897 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.102311 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.102375 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2" gracePeriod=600 Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.869198 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-79bm4"] Dec 06 08:28:10 crc kubenswrapper[4954]: E1206 08:28:10.869843 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="extract-content" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.869868 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="extract-content" Dec 06 08:28:10 crc kubenswrapper[4954]: E1206 08:28:10.869890 4954 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="extract-utilities" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.869899 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="extract-utilities" Dec 06 08:28:10 crc kubenswrapper[4954]: E1206 08:28:10.869922 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="registry-server" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.869930 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="registry-server" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.870101 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80f646f-7307-4968-8fde-94209077053e" containerName="registry-server" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.871369 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.889349 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-79bm4"] Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.989550 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.989659 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:10 crc kubenswrapper[4954]: I1206 08:28:10.989695 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grg5d\" (UniqueName: \"kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.016185 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2" exitCode=0 Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.016233 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2"} Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.016354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"} Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.016416 4954 scope.go:117] "RemoveContainer" 
containerID="067ddb1d97ee695de8b97bbb9551ec2e630a3c81c80be68dbcca59f14db09f84" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.090912 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grg5d\" (UniqueName: \"kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.091474 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.091664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.092392 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.092716 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.118049 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grg5d\" (UniqueName: \"kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d\") pod \"redhat-operators-79bm4\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.205186 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:11 crc kubenswrapper[4954]: I1206 08:28:11.658363 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-79bm4"] Dec 06 08:28:11 crc kubenswrapper[4954]: W1206 08:28:11.658795 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b359005_c3d7_42af_bf33_96d511486c67.slice/crio-b0dc6ed0d9b1b13853573a7a9dac22cbf67c263c629ec355c09d1801ec72fbc3 WatchSource:0}: Error finding container b0dc6ed0d9b1b13853573a7a9dac22cbf67c263c629ec355c09d1801ec72fbc3: Status 404 returned error can't find the container with id b0dc6ed0d9b1b13853573a7a9dac22cbf67c263c629ec355c09d1801ec72fbc3 Dec 06 08:28:12 crc kubenswrapper[4954]: I1206 08:28:12.027319 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b359005-c3d7-42af-bf33-96d511486c67" containerID="00d619bd04d88ba5a21b4e3d2882ea2e809355e6b9fed47c7bafe7657b8bbb79" exitCode=0 Dec 06 08:28:12 crc kubenswrapper[4954]: I1206 08:28:12.027434 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerDied","Data":"00d619bd04d88ba5a21b4e3d2882ea2e809355e6b9fed47c7bafe7657b8bbb79"} Dec 06 08:28:12 crc kubenswrapper[4954]: I1206 08:28:12.027746 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerStarted","Data":"b0dc6ed0d9b1b13853573a7a9dac22cbf67c263c629ec355c09d1801ec72fbc3"} Dec 06 08:28:13 crc kubenswrapper[4954]: I1206 08:28:13.036544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerStarted","Data":"bad9f47a1c8bb2530bcbba935b344d662ec0a5ae7482cbb4abab06f18c2db98d"} Dec 06 08:28:14 crc kubenswrapper[4954]: I1206 08:28:14.046455 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b359005-c3d7-42af-bf33-96d511486c67" containerID="bad9f47a1c8bb2530bcbba935b344d662ec0a5ae7482cbb4abab06f18c2db98d" exitCode=0 Dec 06 08:28:14 crc kubenswrapper[4954]: I1206 08:28:14.046532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerDied","Data":"bad9f47a1c8bb2530bcbba935b344d662ec0a5ae7482cbb4abab06f18c2db98d"} Dec 06 08:28:15 crc kubenswrapper[4954]: I1206 08:28:15.055514 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerStarted","Data":"2a9557199c61bfa49437d21de4500206c064ef0dd7756a74b244f33fd711b9de"} Dec 06 08:28:15 crc kubenswrapper[4954]: I1206 08:28:15.081067 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-79bm4" podStartSLOduration=2.667631224 podStartE2EDuration="5.081018735s" podCreationTimestamp="2025-12-06 08:28:10 +0000 UTC" firstStartedPulling="2025-12-06 08:28:12.029505104 +0000 UTC m=+5466.842864493" lastFinishedPulling="2025-12-06 08:28:14.442892595 +0000 UTC m=+5469.256252004" observedRunningTime="2025-12-06 08:28:15.077877011 +0000 UTC m=+5469.891236410" watchObservedRunningTime="2025-12-06 08:28:15.081018735 +0000 UTC m=+5469.894378124" Dec 06 08:28:21 crc 
kubenswrapper[4954]: I1206 08:28:21.205461 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:21 crc kubenswrapper[4954]: I1206 08:28:21.206228 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:21 crc kubenswrapper[4954]: I1206 08:28:21.264740 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:22 crc kubenswrapper[4954]: I1206 08:28:22.181653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:22 crc kubenswrapper[4954]: I1206 08:28:22.229964 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-79bm4"] Dec 06 08:28:24 crc kubenswrapper[4954]: I1206 08:28:24.137848 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-79bm4" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="registry-server" containerID="cri-o://2a9557199c61bfa49437d21de4500206c064ef0dd7756a74b244f33fd711b9de" gracePeriod=2 Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.164821 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b359005-c3d7-42af-bf33-96d511486c67" containerID="2a9557199c61bfa49437d21de4500206c064ef0dd7756a74b244f33fd711b9de" exitCode=0 Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.166694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerDied","Data":"2a9557199c61bfa49437d21de4500206c064ef0dd7756a74b244f33fd711b9de"} Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.414078 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-79bm4" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.527151 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grg5d\" (UniqueName: \"kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d\") pod \"9b359005-c3d7-42af-bf33-96d511486c67\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.527240 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities\") pod \"9b359005-c3d7-42af-bf33-96d511486c67\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.527317 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content\") pod \"9b359005-c3d7-42af-bf33-96d511486c67\" (UID: \"9b359005-c3d7-42af-bf33-96d511486c67\") " Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.529461 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities" (OuterVolumeSpecName: "utilities") pod "9b359005-c3d7-42af-bf33-96d511486c67" (UID: "9b359005-c3d7-42af-bf33-96d511486c67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.534027 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d" (OuterVolumeSpecName: "kube-api-access-grg5d") pod "9b359005-c3d7-42af-bf33-96d511486c67" (UID: "9b359005-c3d7-42af-bf33-96d511486c67"). InnerVolumeSpecName "kube-api-access-grg5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.629523 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grg5d\" (UniqueName: \"kubernetes.io/projected/9b359005-c3d7-42af-bf33-96d511486c67-kube-api-access-grg5d\") on node \"crc\" DevicePath \"\"" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.629556 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.648326 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b359005-c3d7-42af-bf33-96d511486c67" (UID: "9b359005-c3d7-42af-bf33-96d511486c67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:28:26 crc kubenswrapper[4954]: I1206 08:28:26.731465 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b359005-c3d7-42af-bf33-96d511486c67-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:28:27 crc kubenswrapper[4954]: I1206 08:28:27.175676 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-79bm4" event={"ID":"9b359005-c3d7-42af-bf33-96d511486c67","Type":"ContainerDied","Data":"b0dc6ed0d9b1b13853573a7a9dac22cbf67c263c629ec355c09d1801ec72fbc3"} Dec 06 08:28:27 crc kubenswrapper[4954]: I1206 08:28:27.175731 4954 scope.go:117] "RemoveContainer" containerID="2a9557199c61bfa49437d21de4500206c064ef0dd7756a74b244f33fd711b9de" Dec 06 08:28:27 crc kubenswrapper[4954]: I1206 08:28:27.175746 4954 util.go:48] "No ready sandbox for pod can be found. 
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.157871 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"]
Dec 06 08:30:00 crc kubenswrapper[4954]: E1206 08:30:00.161004 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="registry-server"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.161026 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="registry-server"
Dec 06 08:30:00 crc kubenswrapper[4954]: E1206 08:30:00.161057 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="extract-content"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.161071 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="extract-content"
Dec 06 08:30:00 crc kubenswrapper[4954]: E1206 08:30:00.161103 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="extract-utilities"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.161116 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="extract-utilities"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.161379 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b359005-c3d7-42af-bf33-96d511486c67" containerName="registry-server"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.162033 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.165776 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.165906 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.173408 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"]
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.253252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.254055 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm82z\" (UniqueName: \"kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.254181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.355437 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.355555 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm82z\" (UniqueName: \"kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.355638 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.356929 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.366918 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.376160 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm82z\" (UniqueName: \"kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z\") pod \"collect-profiles-29416830-446qp\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.490549 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.906409 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"]
Dec 06 08:30:00 crc kubenswrapper[4954]: I1206 08:30:00.987114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp" event={"ID":"539b8ab6-bc54-459e-b9cf-a46c09e4449e","Type":"ContainerStarted","Data":"8dcecd52cd1b8434c75716b39924703a10f6ae822f7ab288d0b4e77adcec8312"}
Dec 06 08:30:02 crc kubenswrapper[4954]: I1206 08:30:02.000188 4954 generic.go:334] "Generic (PLEG): container finished" podID="539b8ab6-bc54-459e-b9cf-a46c09e4449e" containerID="a8e62d0fd2f4a7313c07c695aefff39769b1e7af592373966bb295b47168d783" exitCode=0
Dec 06 08:30:02 crc kubenswrapper[4954]: I1206 08:30:02.000338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp" event={"ID":"539b8ab6-bc54-459e-b9cf-a46c09e4449e","Type":"ContainerDied","Data":"a8e62d0fd2f4a7313c07c695aefff39769b1e7af592373966bb295b47168d783"}
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.353678 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.506105 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm82z\" (UniqueName: \"kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z\") pod \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") "
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.506179 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume\") pod \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") "
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.506227 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume\") pod \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\" (UID: \"539b8ab6-bc54-459e-b9cf-a46c09e4449e\") "
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.506936 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume" (OuterVolumeSpecName: "config-volume") pod "539b8ab6-bc54-459e-b9cf-a46c09e4449e" (UID: "539b8ab6-bc54-459e-b9cf-a46c09e4449e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.511388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z" (OuterVolumeSpecName: "kube-api-access-nm82z") pod "539b8ab6-bc54-459e-b9cf-a46c09e4449e" (UID: "539b8ab6-bc54-459e-b9cf-a46c09e4449e"). InnerVolumeSpecName "kube-api-access-nm82z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.511465 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "539b8ab6-bc54-459e-b9cf-a46c09e4449e" (UID: "539b8ab6-bc54-459e-b9cf-a46c09e4449e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.607786 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm82z\" (UniqueName: \"kubernetes.io/projected/539b8ab6-bc54-459e-b9cf-a46c09e4449e-kube-api-access-nm82z\") on node \"crc\" DevicePath \"\""
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.608158 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/539b8ab6-bc54-459e-b9cf-a46c09e4449e-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 06 08:30:03 crc kubenswrapper[4954]: I1206 08:30:03.608172 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/539b8ab6-bc54-459e-b9cf-a46c09e4449e-config-volume\") on node \"crc\" DevicePath \"\""
Dec 06 08:30:04 crc kubenswrapper[4954]: I1206 08:30:04.023211 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp" event={"ID":"539b8ab6-bc54-459e-b9cf-a46c09e4449e","Type":"ContainerDied","Data":"8dcecd52cd1b8434c75716b39924703a10f6ae822f7ab288d0b4e77adcec8312"}
Dec 06 08:30:04 crc kubenswrapper[4954]: I1206 08:30:04.023250 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"
Dec 06 08:30:04 crc kubenswrapper[4954]: I1206 08:30:04.023261 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dcecd52cd1b8434c75716b39924703a10f6ae822f7ab288d0b4e77adcec8312"
Dec 06 08:30:04 crc kubenswrapper[4954]: I1206 08:30:04.457881 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l"]
Dec 06 08:30:04 crc kubenswrapper[4954]: I1206 08:30:04.463673 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416785-n888l"]
Dec 06 08:30:05 crc kubenswrapper[4954]: I1206 08:30:05.460218 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5edc0052-160b-4212-9556-7af69bb55a89" path="/var/lib/kubelet/pods/5edc0052-160b-4212-9556-7af69bb55a89/volumes"
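The reconciler_common records above and throughout this log are a desired-state/actual-state loop: volumes a scheduled pod needs get VerifyControllerAttachedVolume and MountVolume, and volumes still held for a deleted pod get UnmountVolume until "Volume detached" is reported. A schematic sketch in Go, with plain string sets standing in for the kubelet's desired- and actual-world caches (the function and set representation are illustrative, not the real reconciler):

    package main

    import "fmt"

    // reconcile mounts volumes that are desired but not yet actual, and
    // unmounts volumes that are actual but no longer desired, echoing the
    // MountVolume/UnmountVolume message pairs in the log.
    func reconcile(desired, actual map[string]bool) {
    	for vol := range desired {
    		if !actual[vol] {
    			fmt.Printf("MountVolume started for volume %q\n", vol)
    			actual[vol] = true // MountVolume.SetUp succeeded
    		}
    	}
    	for vol := range actual {
    		if !desired[vol] {
    			fmt.Printf("UnmountVolume started for volume %q\n", vol)
    			delete(actual, vol) // UnmountVolume.TearDown succeeded; volume detached
    		}
    	}
    }

    func main() {
    	actual := map[string]bool{}
    	// Pod scheduled: its secret, projected-token and configmap volumes become desired.
    	desired := map[string]bool{"secret-volume": true, "kube-api-access-nm82z": true, "config-volume": true}
    	reconcile(desired, actual)
    	// Pod deleted: the desired world empties and everything is torn down.
    	reconcile(map[string]bool{}, actual)
    }

This is why the unmounts above appear only after the pod's API DELETE: the pod leaves the desired world first, and the next reconciler pass tears its volumes down.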
Dec 06 08:30:10 crc kubenswrapper[4954]: I1206 08:30:10.101432 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:30:10 crc kubenswrapper[4954]: I1206 08:30:10.101948 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:30:22 crc kubenswrapper[4954]: I1206 08:30:22.151465 4954 scope.go:117] "RemoveContainer" containerID="4f335a34f8d198dec55906916f687644d1185db68ad24b6479190a0673b93a79"
Dec 06 08:30:40 crc kubenswrapper[4954]: I1206 08:30:40.100687 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:30:40 crc kubenswrapper[4954]: I1206 08:30:40.101184 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.101886 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.102931 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.103009 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.104241 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.104316 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd" gracePeriod=600
Dec 06 08:31:10 crc kubenswrapper[4954]: E1206 08:31:10.256399 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.576645 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd" exitCode=0
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.576741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"}
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.576845 4954 scope.go:117] "RemoveContainer" containerID="812f2a303638c4394b0c848ba2caf410771ccc5851f640dc426a593da3802ac2"
Dec 06 08:31:10 crc kubenswrapper[4954]: I1206 08:31:10.577920 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:31:10 crc kubenswrapper[4954]: E1206 08:31:10.578366 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
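From here the kubelet retries roughly every 10-15 seconds and rejects each attempt with the CrashLoopBackOff error above until the back-off window expires at 08:36:10. A sketch of the doubling schedule, assuming the kubelet's long-standing defaults of a 10s initial delay, doubled per crash and capped at the 5m0s quoted in the log (the helper is illustrative, not the kubelet's BackOff type):

    package main

    import (
    	"fmt"
    	"time"
    )

    // crashLoopDelay returns the wait imposed after the nth crash, assuming
    // the default schedule: 10s initial delay, doubling, capped at 5m.
    func crashLoopDelay(restarts int) time.Duration {
    	d := 10 * time.Second
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d >= 5*time.Minute {
    			return 5 * time.Minute // the "back-off 5m0s" cap in the log
    		}
    	}
    	return d
    }

    func main() {
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("after crash %d: wait %v\n", r, crashLoopDelay(r))
    	}
    	// Once the delay pins at 5m0s, every sync attempt during the window
    	// is rejected with the CrashLoopBackOff error seen above.
    }

machine-config-daemon has evidently crashed enough times already that the very first restart attempt lands at the 5m cap, which is why every "RemoveContainer"/"Error syncing pod" pair below carries the same back-off message.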
Dec 06 08:31:24 crc kubenswrapper[4954]: I1206 08:31:24.443432 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:31:24 crc kubenswrapper[4954]: E1206 08:31:24.444139 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:31:36 crc kubenswrapper[4954]: I1206 08:31:36.443695 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:31:36 crc kubenswrapper[4954]: E1206 08:31:36.444333 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:31:48 crc kubenswrapper[4954]: I1206 08:31:48.443717 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:31:48 crc kubenswrapper[4954]: E1206 08:31:48.444470 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:32:01 crc kubenswrapper[4954]: I1206 08:32:01.444042 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:32:01 crc kubenswrapper[4954]: E1206 08:32:01.445330 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:32:12 crc kubenswrapper[4954]: I1206 08:32:12.443860 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:32:12 crc kubenswrapper[4954]: E1206 08:32:12.444555 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:32:27 crc kubenswrapper[4954]: I1206 08:32:27.443004 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:32:27 crc kubenswrapper[4954]: E1206 08:32:27.443728 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:32:39 crc kubenswrapper[4954]: I1206 08:32:39.446609 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:32:39 crc kubenswrapper[4954]: E1206 08:32:39.447392 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:32:53 crc kubenswrapper[4954]: I1206 08:32:53.446476 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:32:53 crc kubenswrapper[4954]: E1206 08:32:53.447931 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:33:06 crc kubenswrapper[4954]: I1206 08:33:06.443451 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:33:06 crc kubenswrapper[4954]: E1206 08:33:06.444405 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:33:19 crc kubenswrapper[4954]: I1206 08:33:19.443886 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:33:19 crc kubenswrapper[4954]: E1206 08:33:19.444930 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:33:33 crc kubenswrapper[4954]: I1206 08:33:33.443996 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:33:33 crc kubenswrapper[4954]: E1206 08:33:33.445018 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:33:48 crc kubenswrapper[4954]: I1206 08:33:48.443124 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:33:48 crc kubenswrapper[4954]: E1206 08:33:48.443892 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:34:01 crc kubenswrapper[4954]: I1206 08:34:01.443866 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:34:01 crc kubenswrapper[4954]: E1206 08:34:01.445691 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:34:16 crc kubenswrapper[4954]: I1206 08:34:16.442980 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:34:16 crc kubenswrapper[4954]: E1206 08:34:16.443681 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:34:27 crc kubenswrapper[4954]: I1206 08:34:27.444026 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:34:27 crc kubenswrapper[4954]: E1206 08:34:27.444938 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:34:42 crc kubenswrapper[4954]: I1206 08:34:42.444273 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:34:42 crc kubenswrapper[4954]: E1206 08:34:42.445724 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:34:53 crc kubenswrapper[4954]: I1206 08:34:53.444116 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:34:53 crc kubenswrapper[4954]: E1206 08:34:53.444998 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:35:06 crc kubenswrapper[4954]: I1206 08:35:06.443219 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:35:06 crc kubenswrapper[4954]: E1206 08:35:06.444313 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:35:18 crc kubenswrapper[4954]: I1206 08:35:18.443926 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:35:18 crc kubenswrapper[4954]: E1206 08:35:18.467121 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.142094 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:29 crc kubenswrapper[4954]: E1206 08:35:29.143088 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539b8ab6-bc54-459e-b9cf-a46c09e4449e" containerName="collect-profiles"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.143106 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="539b8ab6-bc54-459e-b9cf-a46c09e4449e" containerName="collect-profiles"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.143287 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="539b8ab6-bc54-459e-b9cf-a46c09e4449e" containerName="collect-profiles"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.144330 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.159799 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.230749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6h87\" (UniqueName: \"kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.230851 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.231163 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.332636 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.332732 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6h87\" (UniqueName: \"kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.332782 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.333315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.333474 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.354050 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6h87\" (UniqueName: \"kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87\") pod \"redhat-marketplace-qn9dh\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") " pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.469277 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.739332 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:29 crc kubenswrapper[4954]: I1206 08:35:29.782446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerStarted","Data":"e30650e0b227fca774e479356dee4b7242ce83aa0944a14d4e7f2ee22d7152b0"}
Dec 06 08:35:30 crc kubenswrapper[4954]: I1206 08:35:30.793053 4954 generic.go:334] "Generic (PLEG): container finished" podID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerID="ee394ccf972327c86999ab6f1a41cce99aa1560eb437c4c2c90c0f2c16237170" exitCode=0
Dec 06 08:35:30 crc kubenswrapper[4954]: I1206 08:35:30.794557 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerDied","Data":"ee394ccf972327c86999ab6f1a41cce99aa1560eb437c4c2c90c0f2c16237170"}
Dec 06 08:35:30 crc kubenswrapper[4954]: I1206 08:35:30.797022 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 08:35:31 crc kubenswrapper[4954]: I1206 08:35:31.445183 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:35:31 crc kubenswrapper[4954]: E1206 08:35:31.445856 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:35:31 crc kubenswrapper[4954]: I1206 08:35:31.806618 4954 generic.go:334] "Generic (PLEG): container finished" podID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerID="5ca873f4a813302b021ca492102affe8c26b22358826e7f9405116802e399977" exitCode=0
Dec 06 08:35:31 crc kubenswrapper[4954]: I1206 08:35:31.806688 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerDied","Data":"5ca873f4a813302b021ca492102affe8c26b22358826e7f9405116802e399977"}
Dec 06 08:35:32 crc kubenswrapper[4954]: I1206 08:35:32.821483 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerStarted","Data":"d8fe23ca3e3045e9390129f714165569c4b38944eb5d28c9bf3ce23115a77221"}
Dec 06 08:35:32 crc kubenswrapper[4954]: I1206 08:35:32.846221 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qn9dh" podStartSLOduration=2.427004857 podStartE2EDuration="3.84599668s" podCreationTimestamp="2025-12-06 08:35:29 +0000 UTC" firstStartedPulling="2025-12-06 08:35:30.796724257 +0000 UTC m=+5905.610083646" lastFinishedPulling="2025-12-06 08:35:32.21571607 +0000 UTC m=+5907.029075469" observedRunningTime="2025-12-06 08:35:32.841243484 +0000 UTC m=+5907.654602873" watchObservedRunningTime="2025-12-06 08:35:32.84599668 +0000 UTC m=+5907.659356069"
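The startup-latency record above is internally consistent: the SLO duration is the end-to-end duration minus the time spent pulling images, at least for this record. Checking against the logged monotonic (m=+...) timestamps:

\begin{align*}
t_{\mathrm{pull}} &= 5907.029075469 - 5905.610083646 = 1.418991823\ \mathrm{s} \\
t_{\mathrm{E2E}} - t_{\mathrm{pull}} &= 3.845996680 - 1.418991823 = 2.427004857\ \mathrm{s}
\end{align*}

which matches the reported podStartSLOduration=2.427004857 exactly. The same arithmetic holds for the certified-operators-mcrp7 record later in this log.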
Dec 06 08:35:39 crc kubenswrapper[4954]: I1206 08:35:39.471514 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:39 crc kubenswrapper[4954]: I1206 08:35:39.472422 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:39 crc kubenswrapper[4954]: I1206 08:35:39.523462 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:39 crc kubenswrapper[4954]: I1206 08:35:39.933488 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:39 crc kubenswrapper[4954]: I1206 08:35:39.984706 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:41 crc kubenswrapper[4954]: I1206 08:35:41.898542 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qn9dh" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="registry-server" containerID="cri-o://d8fe23ca3e3045e9390129f714165569c4b38944eb5d28c9bf3ce23115a77221" gracePeriod=2
Dec 06 08:35:42 crc kubenswrapper[4954]: I1206 08:35:42.444543 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:35:42 crc kubenswrapper[4954]: E1206 08:35:42.444958 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:35:42 crc kubenswrapper[4954]: I1206 08:35:42.907325 4954 generic.go:334] "Generic (PLEG): container finished" podID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerID="d8fe23ca3e3045e9390129f714165569c4b38944eb5d28c9bf3ce23115a77221" exitCode=0
Dec 06 08:35:42 crc kubenswrapper[4954]: I1206 08:35:42.907396 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerDied","Data":"d8fe23ca3e3045e9390129f714165569c4b38944eb5d28c9bf3ce23115a77221"}
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.551111 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.682318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities\") pod \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") "
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.682373 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6h87\" (UniqueName: \"kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87\") pod \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") "
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.682464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content\") pod \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\" (UID: \"d41b3bae-a31b-4b9b-8612-aa10731f36b0\") "
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.683598 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities" (OuterVolumeSpecName: "utilities") pod "d41b3bae-a31b-4b9b-8612-aa10731f36b0" (UID: "d41b3bae-a31b-4b9b-8612-aa10731f36b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.692047 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87" (OuterVolumeSpecName: "kube-api-access-k6h87") pod "d41b3bae-a31b-4b9b-8612-aa10731f36b0" (UID: "d41b3bae-a31b-4b9b-8612-aa10731f36b0"). InnerVolumeSpecName "kube-api-access-k6h87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.705125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d41b3bae-a31b-4b9b-8612-aa10731f36b0" (UID: "d41b3bae-a31b-4b9b-8612-aa10731f36b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.783551 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6h87\" (UniqueName: \"kubernetes.io/projected/d41b3bae-a31b-4b9b-8612-aa10731f36b0-kube-api-access-k6h87\") on node \"crc\" DevicePath \"\""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.783601 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.783613 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41b3bae-a31b-4b9b-8612-aa10731f36b0-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.922018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qn9dh" event={"ID":"d41b3bae-a31b-4b9b-8612-aa10731f36b0","Type":"ContainerDied","Data":"e30650e0b227fca774e479356dee4b7242ce83aa0944a14d4e7f2ee22d7152b0"}
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.922136 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qn9dh"
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.922737 4954 scope.go:117] "RemoveContainer" containerID="d8fe23ca3e3045e9390129f714165569c4b38944eb5d28c9bf3ce23115a77221"
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.962331 4954 scope.go:117] "RemoveContainer" containerID="5ca873f4a813302b021ca492102affe8c26b22358826e7f9405116802e399977"
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.986695 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:43 crc kubenswrapper[4954]: I1206 08:35:43.995958 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qn9dh"]
Dec 06 08:35:44 crc kubenswrapper[4954]: I1206 08:35:44.017864 4954 scope.go:117] "RemoveContainer" containerID="ee394ccf972327c86999ab6f1a41cce99aa1560eb437c4c2c90c0f2c16237170"
Dec 06 08:35:45 crc kubenswrapper[4954]: I1206 08:35:45.455369 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" path="/var/lib/kubelet/pods/d41b3bae-a31b-4b9b-8612-aa10731f36b0/volumes"
Dec 06 08:35:57 crc kubenswrapper[4954]: I1206 08:35:57.443172 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:35:57 crc kubenswrapper[4954]: E1206 08:35:57.443909 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:36:10 crc kubenswrapper[4954]: I1206 08:36:10.443083 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:36:11 crc kubenswrapper[4954]: I1206 08:36:11.111067 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be"}
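This restart closes the crash loop: the container was killed after the failed liveness probe at 08:31:10, every retry through 08:35:57 was rejected with "back-off 5m0s", and the first attempt at or past the five-minute mark went through:

\[
08{:}36{:}10 - 08{:}31{:}10 = 5\,\mathrm{m}\ 0\,\mathrm{s}
\]

which matches the advertised back-off window exactly, measured here from the container's termination. Note the liveness-probe failures stop after this point, so the restarted daemon is presumably serving /health again.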
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.234551 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"]
Dec 06 08:37:02 crc kubenswrapper[4954]: E1206 08:37:02.239233 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="extract-content"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.239271 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="extract-content"
Dec 06 08:37:02 crc kubenswrapper[4954]: E1206 08:37:02.239289 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="extract-utilities"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.239298 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="extract-utilities"
Dec 06 08:37:02 crc kubenswrapper[4954]: E1206 08:37:02.239312 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="registry-server"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.239318 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="registry-server"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.239509 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41b3bae-a31b-4b9b-8612-aa10731f36b0" containerName="registry-server"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.242536 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"]
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.243735 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.433800 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.433993 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hl7\" (UniqueName: \"kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.434142 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.535718 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.535781 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.535887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hl7\" (UniqueName: \"kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.536329 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.537206 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.565598 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hl7\" (UniqueName: \"kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7\") pod \"certified-operators-mcrp7\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:02 crc kubenswrapper[4954]: I1206 08:37:02.586384 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcrp7"
Dec 06 08:37:03 crc kubenswrapper[4954]: I1206 08:37:03.088530 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"]
Dec 06 08:37:03 crc kubenswrapper[4954]: I1206 08:37:03.554777 4954 generic.go:334] "Generic (PLEG): container finished" podID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerID="bc6401b2d67b90dc51002faee5998b541619d9ec8974c436ecbaa0c57d31a094" exitCode=0
Dec 06 08:37:03 crc kubenswrapper[4954]: I1206 08:37:03.554864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerDied","Data":"bc6401b2d67b90dc51002faee5998b541619d9ec8974c436ecbaa0c57d31a094"}
Dec 06 08:37:03 crc kubenswrapper[4954]: I1206 08:37:03.554897 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerStarted","Data":"59501f053907e2e283c7d889e2220622639d5baf8be8838cdc0dc6265b8fbd82"}
Dec 06 08:37:04 crc kubenswrapper[4954]: I1206 08:37:04.578796 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerStarted","Data":"7300334699d71ed97b9aa1c1f34e76b50fec651d119d238b989a5ec8c1bf9adf"}
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.318216 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-5ql4s"]
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.323040 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-5ql4s"]
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.463807 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaf5c66-35fa-41ac-915e-adfac50f0261" path="/var/lib/kubelet/pods/8eaf5c66-35fa-41ac-915e-adfac50f0261/volumes"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.465613 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-msmm6"]
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.468386 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.473106 4954 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-98b5k"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.473447 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.473857 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.475027 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.481335 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-msmm6"]
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.581071 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.581587 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.581619 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msb55\" (UniqueName: \"kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.589135 4954 generic.go:334] "Generic (PLEG): container finished" podID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerID="7300334699d71ed97b9aa1c1f34e76b50fec651d119d238b989a5ec8c1bf9adf" exitCode=0
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.589206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerDied","Data":"7300334699d71ed97b9aa1c1f34e76b50fec651d119d238b989a5ec8c1bf9adf"}
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.683258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.683321 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msb55\" (UniqueName: \"kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.683999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.684012 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.684848 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.709375 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msb55\" (UniqueName: \"kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55\") pod \"crc-storage-crc-msmm6\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") " pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:05 crc kubenswrapper[4954]: I1206 08:37:05.801124 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:06 crc kubenswrapper[4954]: I1206 08:37:06.257263 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-msmm6"]
Dec 06 08:37:06 crc kubenswrapper[4954]: W1206 08:37:06.258302 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fd55905_1123_4dd0_8593_f1e7b2650f78.slice/crio-1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910 WatchSource:0}: Error finding container 1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910: Status 404 returned error can't find the container with id 1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910
Dec 06 08:37:06 crc kubenswrapper[4954]: I1206 08:37:06.601633 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-msmm6" event={"ID":"3fd55905-1123-4dd0-8593-f1e7b2650f78","Type":"ContainerStarted","Data":"1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910"}
Dec 06 08:37:06 crc kubenswrapper[4954]: I1206 08:37:06.607252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerStarted","Data":"c978287893438db25e9c965d1887992d81a989586a165bb4743388b672c0afa2"}
Dec 06 08:37:06 crc kubenswrapper[4954]: I1206 08:37:06.641178 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mcrp7" podStartSLOduration=2.221175537 podStartE2EDuration="4.641146672s" podCreationTimestamp="2025-12-06 08:37:02 +0000 UTC" firstStartedPulling="2025-12-06 08:37:03.557798084 +0000 UTC m=+5998.371157473" lastFinishedPulling="2025-12-06 08:37:05.977769209 +0000 UTC m=+6000.791128608" observedRunningTime="2025-12-06 08:37:06.636020276 +0000 UTC m=+6001.449379675" watchObservedRunningTime="2025-12-06 08:37:06.641146672 +0000 UTC m=+6001.454506061"
Dec 06 08:37:07 crc kubenswrapper[4954]: I1206 08:37:07.615216 4954 generic.go:334] "Generic (PLEG): container finished" podID="3fd55905-1123-4dd0-8593-f1e7b2650f78" containerID="9422f2af6b3998c4acdb66d786d40ae1611f3e78c0dab9847a12bcee46a87a60" exitCode=0
Dec 06 08:37:07 crc kubenswrapper[4954]: I1206 08:37:07.615616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-msmm6" event={"ID":"3fd55905-1123-4dd0-8593-f1e7b2650f78","Type":"ContainerDied","Data":"9422f2af6b3998c4acdb66d786d40ae1611f3e78c0dab9847a12bcee46a87a60"}
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.928972 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-msmm6"
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.936410 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msb55\" (UniqueName: \"kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55\") pod \"3fd55905-1123-4dd0-8593-f1e7b2650f78\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") "
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.936530 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage\") pod \"3fd55905-1123-4dd0-8593-f1e7b2650f78\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") "
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.936723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt\") pod \"3fd55905-1123-4dd0-8593-f1e7b2650f78\" (UID: \"3fd55905-1123-4dd0-8593-f1e7b2650f78\") "
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.937327 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3fd55905-1123-4dd0-8593-f1e7b2650f78" (UID: "3fd55905-1123-4dd0-8593-f1e7b2650f78"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.952959 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55" (OuterVolumeSpecName: "kube-api-access-msb55") pod "3fd55905-1123-4dd0-8593-f1e7b2650f78" (UID: "3fd55905-1123-4dd0-8593-f1e7b2650f78"). InnerVolumeSpecName "kube-api-access-msb55". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:37:08 crc kubenswrapper[4954]: I1206 08:37:08.973025 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3fd55905-1123-4dd0-8593-f1e7b2650f78" (UID: "3fd55905-1123-4dd0-8593-f1e7b2650f78"). InnerVolumeSpecName "crc-storage".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.038130 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msb55\" (UniqueName: \"kubernetes.io/projected/3fd55905-1123-4dd0-8593-f1e7b2650f78-kube-api-access-msb55\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.038165 4954 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fd55905-1123-4dd0-8593-f1e7b2650f78-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.038176 4954 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fd55905-1123-4dd0-8593-f1e7b2650f78-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.633705 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-msmm6" event={"ID":"3fd55905-1123-4dd0-8593-f1e7b2650f78","Type":"ContainerDied","Data":"1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910"} Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.633771 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1542da8306e7d97f40fc1c56e2e3f103f8f7903af0a6bcdb61cf6c3c25beb910" Dec 06 08:37:09 crc kubenswrapper[4954]: I1206 08:37:09.633832 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-msmm6" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.182532 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-msmm6"] Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.190170 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-msmm6"] Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.317780 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ww8zv"] Dec 06 08:37:11 crc kubenswrapper[4954]: E1206 08:37:11.318259 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd55905-1123-4dd0-8593-f1e7b2650f78" containerName="storage" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.318281 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd55905-1123-4dd0-8593-f1e7b2650f78" containerName="storage" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.318432 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd55905-1123-4dd0-8593-f1e7b2650f78" containerName="storage" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.318997 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.321684 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.321903 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.322459 4954 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-98b5k" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.322607 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.334150 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ww8zv"] Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.379856 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbwp\" (UniqueName: \"kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.379977 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.380017 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.451977 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd55905-1123-4dd0-8593-f1e7b2650f78" path="/var/lib/kubelet/pods/3fd55905-1123-4dd0-8593-f1e7b2650f78/volumes" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.481413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.481511 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbwp\" (UniqueName: \"kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.481795 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.482016 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.482191 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.504346 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbwp\" (UniqueName: \"kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp\") pod \"crc-storage-crc-ww8zv\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.642674 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:11 crc kubenswrapper[4954]: I1206 08:37:11.867226 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ww8zv"] Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.586991 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.587057 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.642669 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.662464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8zv" event={"ID":"83d9d42e-ac14-4521-8eb9-253edb8b378e","Type":"ContainerStarted","Data":"b80ad90d0feb0c2f21b1e900725dc63876f2c8e8aa4bf4d73973760162da86e0"} Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.715036 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:12 crc kubenswrapper[4954]: I1206 08:37:12.879546 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"] Dec 06 08:37:13 crc kubenswrapper[4954]: I1206 08:37:13.673387 4954 generic.go:334] "Generic (PLEG): container finished" podID="83d9d42e-ac14-4521-8eb9-253edb8b378e" containerID="18a97eb14e852b333b776074cb9217d0b6d110ecfe07088d67db74ab2300a106" exitCode=0 Dec 06 08:37:13 crc kubenswrapper[4954]: I1206 08:37:13.673490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8zv" event={"ID":"83d9d42e-ac14-4521-8eb9-253edb8b378e","Type":"ContainerDied","Data":"18a97eb14e852b333b776074cb9217d0b6d110ecfe07088d67db74ab2300a106"} Dec 06 08:37:14 crc kubenswrapper[4954]: I1206 08:37:14.682072 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mcrp7" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="registry-server" containerID="cri-o://c978287893438db25e9c965d1887992d81a989586a165bb4743388b672c0afa2" gracePeriod=2 Dec 
06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.034066 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.141406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt\") pod \"83d9d42e-ac14-4521-8eb9-253edb8b378e\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.141620 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "83d9d42e-ac14-4521-8eb9-253edb8b378e" (UID: "83d9d42e-ac14-4521-8eb9-253edb8b378e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.141665 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage\") pod \"83d9d42e-ac14-4521-8eb9-253edb8b378e\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.141699 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbwp\" (UniqueName: \"kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp\") pod \"83d9d42e-ac14-4521-8eb9-253edb8b378e\" (UID: \"83d9d42e-ac14-4521-8eb9-253edb8b378e\") " Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.141977 4954 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83d9d42e-ac14-4521-8eb9-253edb8b378e-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.147062 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp" (OuterVolumeSpecName: "kube-api-access-snbwp") pod "83d9d42e-ac14-4521-8eb9-253edb8b378e" (UID: "83d9d42e-ac14-4521-8eb9-253edb8b378e"). InnerVolumeSpecName "kube-api-access-snbwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.159789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "83d9d42e-ac14-4521-8eb9-253edb8b378e" (UID: "83d9d42e-ac14-4521-8eb9-253edb8b378e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.243972 4954 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83d9d42e-ac14-4521-8eb9-253edb8b378e-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.244020 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbwp\" (UniqueName: \"kubernetes.io/projected/83d9d42e-ac14-4521-8eb9-253edb8b378e-kube-api-access-snbwp\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.699694 4954 generic.go:334] "Generic (PLEG): container finished" podID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerID="c978287893438db25e9c965d1887992d81a989586a165bb4743388b672c0afa2" exitCode=0 Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.699744 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerDied","Data":"c978287893438db25e9c965d1887992d81a989586a165bb4743388b672c0afa2"} Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.702263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ww8zv" event={"ID":"83d9d42e-ac14-4521-8eb9-253edb8b378e","Type":"ContainerDied","Data":"b80ad90d0feb0c2f21b1e900725dc63876f2c8e8aa4bf4d73973760162da86e0"} Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.702301 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80ad90d0feb0c2f21b1e900725dc63876f2c8e8aa4bf4d73973760162da86e0" Dec 06 08:37:15 crc kubenswrapper[4954]: I1206 08:37:15.702365 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ww8zv" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.250366 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.361966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42hl7\" (UniqueName: \"kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7\") pod \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.362029 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content\") pod \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.362098 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities\") pod \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\" (UID: \"399f6a0e-06e8-4a45-8a4a-8531e011e5f0\") " Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.363101 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities" (OuterVolumeSpecName: "utilities") pod "399f6a0e-06e8-4a45-8a4a-8531e011e5f0" (UID: "399f6a0e-06e8-4a45-8a4a-8531e011e5f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.366621 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7" (OuterVolumeSpecName: "kube-api-access-42hl7") pod "399f6a0e-06e8-4a45-8a4a-8531e011e5f0" (UID: "399f6a0e-06e8-4a45-8a4a-8531e011e5f0"). InnerVolumeSpecName "kube-api-access-42hl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.412516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "399f6a0e-06e8-4a45-8a4a-8531e011e5f0" (UID: "399f6a0e-06e8-4a45-8a4a-8531e011e5f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.463598 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.463638 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42hl7\" (UniqueName: \"kubernetes.io/projected/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-kube-api-access-42hl7\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.463689 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399f6a0e-06e8-4a45-8a4a-8531e011e5f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.712973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcrp7" event={"ID":"399f6a0e-06e8-4a45-8a4a-8531e011e5f0","Type":"ContainerDied","Data":"59501f053907e2e283c7d889e2220622639d5baf8be8838cdc0dc6265b8fbd82"} Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.713026 4954 scope.go:117] "RemoveContainer" containerID="c978287893438db25e9c965d1887992d81a989586a165bb4743388b672c0afa2" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.713026 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mcrp7" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.731278 4954 scope.go:117] "RemoveContainer" containerID="7300334699d71ed97b9aa1c1f34e76b50fec651d119d238b989a5ec8c1bf9adf" Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.744325 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"] Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.750693 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mcrp7"] Dec 06 08:37:16 crc kubenswrapper[4954]: I1206 08:37:16.767132 4954 scope.go:117] "RemoveContainer" containerID="bc6401b2d67b90dc51002faee5998b541619d9ec8974c436ecbaa0c57d31a094" Dec 06 08:37:17 crc kubenswrapper[4954]: I1206 08:37:17.453012 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" path="/var/lib/kubelet/pods/399f6a0e-06e8-4a45-8a4a-8531e011e5f0/volumes" Dec 06 08:37:22 crc kubenswrapper[4954]: I1206 08:37:22.308787 4954 scope.go:117] "RemoveContainer" containerID="df96f4f2d31d66b67a00f4ca761a910560a4aa6b969945f427faf07832ee150c" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.243200 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:37:52 crc kubenswrapper[4954]: E1206 08:37:52.244193 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="extract-utilities" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244209 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="extract-utilities" Dec 06 08:37:52 crc kubenswrapper[4954]: E1206 08:37:52.244219 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="registry-server" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244225 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="registry-server" Dec 06 08:37:52 crc kubenswrapper[4954]: E1206 08:37:52.244243 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="extract-content" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244250 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="extract-content" Dec 06 08:37:52 crc kubenswrapper[4954]: E1206 08:37:52.244272 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d9d42e-ac14-4521-8eb9-253edb8b378e" containerName="storage" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244280 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d9d42e-ac14-4521-8eb9-253edb8b378e" containerName="storage" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244418 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="399f6a0e-06e8-4a45-8a4a-8531e011e5f0" containerName="registry-server" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.244438 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d9d42e-ac14-4521-8eb9-253edb8b378e" containerName="storage" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.245405 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.255421 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.322773 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855md\" (UniqueName: \"kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.322872 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.322940 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.425001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.425100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855md\" (UniqueName: \"kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.425205 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.425633 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.425781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.660678 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-855md\" (UniqueName: \"kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md\") pod \"community-operators-hwdpq\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:52 crc kubenswrapper[4954]: I1206 08:37:52.863099 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:37:53 crc kubenswrapper[4954]: I1206 08:37:53.358696 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:37:54 crc kubenswrapper[4954]: I1206 08:37:54.019940 4954 generic.go:334] "Generic (PLEG): container finished" podID="3da05d94-0963-405c-a531-e2bb2fe09247" containerID="7a11d2be2ebe870fb03e3370a085715b735a6793c66fd31269edc8845f1343a8" exitCode=0 Dec 06 08:37:54 crc kubenswrapper[4954]: I1206 08:37:54.020068 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerDied","Data":"7a11d2be2ebe870fb03e3370a085715b735a6793c66fd31269edc8845f1343a8"} Dec 06 08:37:54 crc kubenswrapper[4954]: I1206 08:37:54.020725 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerStarted","Data":"28b46056d2b04091d01b9a1df6aa71d649fc0556b7c675088b7bc3c9ce2bd843"} Dec 06 08:37:55 crc kubenswrapper[4954]: I1206 08:37:55.033609 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerStarted","Data":"8c268a125c9e7af3251d8b885042acca8adb799de8dee9d01c26113a425c45e8"} Dec 06 08:37:56 crc kubenswrapper[4954]: I1206 08:37:56.044721 4954 generic.go:334] "Generic (PLEG): container finished" podID="3da05d94-0963-405c-a531-e2bb2fe09247" containerID="8c268a125c9e7af3251d8b885042acca8adb799de8dee9d01c26113a425c45e8" exitCode=0 Dec 06 08:37:56 crc kubenswrapper[4954]: I1206 08:37:56.044796 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerDied","Data":"8c268a125c9e7af3251d8b885042acca8adb799de8dee9d01c26113a425c45e8"} Dec 06 08:37:57 crc kubenswrapper[4954]: I1206 08:37:57.053865 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerStarted","Data":"cf0e990e7cf2707304d510fd61a35e56627cc4ed186395a2b97944da26f56384"} Dec 06 08:37:57 crc kubenswrapper[4954]: I1206 08:37:57.076600 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hwdpq" podStartSLOduration=2.685061497 podStartE2EDuration="5.076583004s" podCreationTimestamp="2025-12-06 08:37:52 +0000 UTC" firstStartedPulling="2025-12-06 08:37:54.021544351 +0000 UTC m=+6048.834903750" lastFinishedPulling="2025-12-06 08:37:56.413065868 +0000 UTC m=+6051.226425257" observedRunningTime="2025-12-06 08:37:57.07269152 +0000 UTC m=+6051.886050909" watchObservedRunningTime="2025-12-06 08:37:57.076583004 +0000 UTC m=+6051.889942393" Dec 06 08:38:02 crc kubenswrapper[4954]: I1206 08:38:02.863244 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:02 crc kubenswrapper[4954]: I1206 08:38:02.863882 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:02 crc kubenswrapper[4954]: I1206 08:38:02.904393 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:03 crc kubenswrapper[4954]: I1206 08:38:03.158651 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:03 crc kubenswrapper[4954]: I1206 08:38:03.207467 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:38:05 crc kubenswrapper[4954]: I1206 08:38:05.131874 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hwdpq" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="registry-server" containerID="cri-o://cf0e990e7cf2707304d510fd61a35e56627cc4ed186395a2b97944da26f56384" gracePeriod=2 Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.101475 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.102073 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.689021 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.834408 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content\") pod \"3da05d94-0963-405c-a531-e2bb2fe09247\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.834685 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities\") pod \"3da05d94-0963-405c-a531-e2bb2fe09247\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.834734 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855md\" (UniqueName: \"kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md\") pod \"3da05d94-0963-405c-a531-e2bb2fe09247\" (UID: \"3da05d94-0963-405c-a531-e2bb2fe09247\") " Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.835743 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities" (OuterVolumeSpecName: "utilities") pod "3da05d94-0963-405c-a531-e2bb2fe09247" (UID: "3da05d94-0963-405c-a531-e2bb2fe09247"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.841516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md" (OuterVolumeSpecName: "kube-api-access-855md") pod "3da05d94-0963-405c-a531-e2bb2fe09247" (UID: "3da05d94-0963-405c-a531-e2bb2fe09247"). InnerVolumeSpecName "kube-api-access-855md". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.897207 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3da05d94-0963-405c-a531-e2bb2fe09247" (UID: "3da05d94-0963-405c-a531-e2bb2fe09247"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.937226 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.937300 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855md\" (UniqueName: \"kubernetes.io/projected/3da05d94-0963-405c-a531-e2bb2fe09247-kube-api-access-855md\") on node \"crc\" DevicePath \"\"" Dec 06 08:38:10 crc kubenswrapper[4954]: I1206 08:38:10.937341 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da05d94-0963-405c-a531-e2bb2fe09247-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:38:11 crc kubenswrapper[4954]: I1206 08:38:11.002052 4954 generic.go:334] "Generic (PLEG): container finished" podID="3da05d94-0963-405c-a531-e2bb2fe09247" containerID="cf0e990e7cf2707304d510fd61a35e56627cc4ed186395a2b97944da26f56384" exitCode=0 Dec 06 08:38:11 crc kubenswrapper[4954]: I1206 08:38:11.002115 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerDied","Data":"cf0e990e7cf2707304d510fd61a35e56627cc4ed186395a2b97944da26f56384"} Dec 06 08:38:11 crc kubenswrapper[4954]: I1206 08:38:11.002263 4954 scope.go:117] "RemoveContainer" containerID="cf0e990e7cf2707304d510fd61a35e56627cc4ed186395a2b97944da26f56384" Dec 06 08:38:11 crc kubenswrapper[4954]: I1206 08:38:11.025304 4954 scope.go:117] "RemoveContainer" containerID="8c268a125c9e7af3251d8b885042acca8adb799de8dee9d01c26113a425c45e8" Dec 06 08:38:11 crc kubenswrapper[4954]: I1206 08:38:11.042802 4954 scope.go:117] "RemoveContainer" containerID="7a11d2be2ebe870fb03e3370a085715b735a6793c66fd31269edc8845f1343a8" Dec 06 08:38:12 crc kubenswrapper[4954]: I1206 08:38:12.010186 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdpq" event={"ID":"3da05d94-0963-405c-a531-e2bb2fe09247","Type":"ContainerDied","Data":"28b46056d2b04091d01b9a1df6aa71d649fc0556b7c675088b7bc3c9ce2bd843"} Dec 06 08:38:12 crc kubenswrapper[4954]: I1206 08:38:12.010297 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwdpq" Dec 06 08:38:12 crc kubenswrapper[4954]: I1206 08:38:12.034116 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:38:12 crc kubenswrapper[4954]: I1206 08:38:12.041601 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hwdpq"] Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.453683 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" path="/var/lib/kubelet/pods/3da05d94-0963-405c-a531-e2bb2fe09247/volumes" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.647476 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"] Dec 06 08:38:13 crc kubenswrapper[4954]: E1206 08:38:13.647821 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="registry-server" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.647836 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="registry-server" Dec 06 08:38:13 crc kubenswrapper[4954]: E1206 08:38:13.647855 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="extract-content" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.647861 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="extract-content" Dec 06 08:38:13 crc kubenswrapper[4954]: E1206 08:38:13.647878 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="extract-utilities" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.647885 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="extract-utilities" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.648022 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da05d94-0963-405c-a531-e2bb2fe09247" containerName="registry-server" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.649216 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.666211 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"] Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.777635 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrg4\" (UniqueName: \"kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.778196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.778325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.879132 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.879180 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.879259 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrg4\" (UniqueName: \"kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.879830 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.879827 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.905143 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7rrg4\" (UniqueName: \"kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4\") pod \"redhat-operators-ckqk4\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") " pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:13 crc kubenswrapper[4954]: I1206 08:38:13.970695 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckqk4" Dec 06 08:38:14 crc kubenswrapper[4954]: I1206 08:38:14.396827 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"] Dec 06 08:38:15 crc kubenswrapper[4954]: I1206 08:38:15.053248 4954 generic.go:334] "Generic (PLEG): container finished" podID="85588a85-b1f3-436e-8ad9-1668199980aa" containerID="c5a31d9f7b9e5441126ab5de23eb1ad01c574efe1bd7db0b7930982e620cbb51" exitCode=0 Dec 06 08:38:15 crc kubenswrapper[4954]: I1206 08:38:15.053328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerDied","Data":"c5a31d9f7b9e5441126ab5de23eb1ad01c574efe1bd7db0b7930982e620cbb51"} Dec 06 08:38:15 crc kubenswrapper[4954]: I1206 08:38:15.053544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerStarted","Data":"601de1afe63c0ece9efeda3bb432b246b42be0e02d42b4756e00643f58ebc9aa"} Dec 06 08:38:16 crc kubenswrapper[4954]: I1206 08:38:16.062343 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerStarted","Data":"c81c88c19ffa0e59a274a42c0f2ebd415e517dfa89ed011c48c8d9ecad65ad86"} Dec 06 08:38:17 crc kubenswrapper[4954]: I1206 08:38:17.073043 4954 generic.go:334] "Generic (PLEG): container finished" podID="85588a85-b1f3-436e-8ad9-1668199980aa" containerID="c81c88c19ffa0e59a274a42c0f2ebd415e517dfa89ed011c48c8d9ecad65ad86" exitCode=0 Dec 06 08:38:17 crc kubenswrapper[4954]: I1206 08:38:17.073091 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerDied","Data":"c81c88c19ffa0e59a274a42c0f2ebd415e517dfa89ed011c48c8d9ecad65ad86"} Dec 06 08:38:18 crc kubenswrapper[4954]: I1206 08:38:18.082964 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerStarted","Data":"40edd18370c70294c485d9b516293297e547308eff764d01bb6502a286143b2d"} Dec 06 08:38:18 crc kubenswrapper[4954]: I1206 08:38:18.102023 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ckqk4" podStartSLOduration=2.694083363 podStartE2EDuration="5.102001927s" podCreationTimestamp="2025-12-06 08:38:13 +0000 UTC" firstStartedPulling="2025-12-06 08:38:15.055904572 +0000 UTC m=+6069.869263961" lastFinishedPulling="2025-12-06 08:38:17.463823136 +0000 UTC m=+6072.277182525" observedRunningTime="2025-12-06 08:38:18.100006404 +0000 UTC m=+6072.913365793" watchObservedRunningTime="2025-12-06 08:38:18.102001927 +0000 UTC m=+6072.915361316" Dec 06 08:38:23 crc kubenswrapper[4954]: I1206 08:38:23.971610 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ckqk4" 
Dec 06 08:38:23 crc kubenswrapper[4954]: I1206 08:38:23.972214 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ckqk4"
Dec 06 08:38:24 crc kubenswrapper[4954]: I1206 08:38:24.014940 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ckqk4"
Dec 06 08:38:24 crc kubenswrapper[4954]: I1206 08:38:24.167787 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ckqk4"
Dec 06 08:38:24 crc kubenswrapper[4954]: I1206 08:38:24.643511 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"]
Dec 06 08:38:26 crc kubenswrapper[4954]: I1206 08:38:26.140028 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ckqk4" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="registry-server" containerID="cri-o://40edd18370c70294c485d9b516293297e547308eff764d01bb6502a286143b2d" gracePeriod=2
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.166466 4954 generic.go:334] "Generic (PLEG): container finished" podID="85588a85-b1f3-436e-8ad9-1668199980aa" containerID="40edd18370c70294c485d9b516293297e547308eff764d01bb6502a286143b2d" exitCode=0
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.166547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerDied","Data":"40edd18370c70294c485d9b516293297e547308eff764d01bb6502a286143b2d"}
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.652275 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckqk4"
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.847446 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content\") pod \"85588a85-b1f3-436e-8ad9-1668199980aa\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") "
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.847488 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rrg4\" (UniqueName: \"kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4\") pod \"85588a85-b1f3-436e-8ad9-1668199980aa\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") "
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.847675 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities\") pod \"85588a85-b1f3-436e-8ad9-1668199980aa\" (UID: \"85588a85-b1f3-436e-8ad9-1668199980aa\") "
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.848857 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities" (OuterVolumeSpecName: "utilities") pod "85588a85-b1f3-436e-8ad9-1668199980aa" (UID: "85588a85-b1f3-436e-8ad9-1668199980aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.858018 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4" (OuterVolumeSpecName: "kube-api-access-7rrg4") pod "85588a85-b1f3-436e-8ad9-1668199980aa" (UID: "85588a85-b1f3-436e-8ad9-1668199980aa"). InnerVolumeSpecName "kube-api-access-7rrg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.949533 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.949600 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rrg4\" (UniqueName: \"kubernetes.io/projected/85588a85-b1f3-436e-8ad9-1668199980aa-kube-api-access-7rrg4\") on node \"crc\" DevicePath \"\""
Dec 06 08:38:29 crc kubenswrapper[4954]: I1206 08:38:29.977440 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85588a85-b1f3-436e-8ad9-1668199980aa" (UID: "85588a85-b1f3-436e-8ad9-1668199980aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.050543 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85588a85-b1f3-436e-8ad9-1668199980aa-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.175449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckqk4" event={"ID":"85588a85-b1f3-436e-8ad9-1668199980aa","Type":"ContainerDied","Data":"601de1afe63c0ece9efeda3bb432b246b42be0e02d42b4756e00643f58ebc9aa"}
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.175503 4954 scope.go:117] "RemoveContainer" containerID="40edd18370c70294c485d9b516293297e547308eff764d01bb6502a286143b2d"
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.176055 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckqk4"
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.195213 4954 scope.go:117] "RemoveContainer" containerID="c81c88c19ffa0e59a274a42c0f2ebd415e517dfa89ed011c48c8d9ecad65ad86"
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.223919 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"]
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.236377 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ckqk4"]
Dec 06 08:38:30 crc kubenswrapper[4954]: I1206 08:38:30.237470 4954 scope.go:117] "RemoveContainer" containerID="c5a31d9f7b9e5441126ab5de23eb1ad01c574efe1bd7db0b7930982e620cbb51"
Dec 06 08:38:31 crc kubenswrapper[4954]: I1206 08:38:31.453039 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" path="/var/lib/kubelet/pods/85588a85-b1f3-436e-8ad9-1668199980aa/volumes"
Dec 06 08:38:40 crc kubenswrapper[4954]: I1206 08:38:40.101052 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:38:40 crc kubenswrapper[4954]: I1206 08:38:40.101731 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.101273 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.101884 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.101946 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.102709 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.102775 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be" gracePeriod=600
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.472992 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be" exitCode=0
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.473343 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be"}
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.473373 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448"}
Dec 06 08:39:10 crc kubenswrapper[4954]: I1206 08:39:10.473412 4954 scope.go:117] "RemoveContainer" containerID="98b2cc0cc01a0edf43edc1ed72809d00597af41d5121335e860b1a2cb9e99dcd"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.906689 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"]
Dec 06 08:39:23 crc kubenswrapper[4954]: E1206 08:39:23.907725 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="extract-content"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.907737 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="extract-content"
Dec 06 08:39:23 crc kubenswrapper[4954]: E1206 08:39:23.907751 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="registry-server"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.907758 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="registry-server"
Dec 06 08:39:23 crc kubenswrapper[4954]: E1206 08:39:23.907765 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="extract-utilities"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.907771 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="extract-utilities"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.907965 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="85588a85-b1f3-436e-8ad9-1668199980aa" containerName="registry-server"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.908733 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.910541 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ccl9b"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.910888 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.911199 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.911411 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.917214 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"]
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.918483 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.923376 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.939320 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"]
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.945022 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"]
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.946088 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwx28\" (UniqueName: \"kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.947083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.947260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.947302 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7x8c\" (UniqueName: \"kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:23 crc kubenswrapper[4954]: I1206 08:39:23.947413 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.048492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.048540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7x8c\" (UniqueName: \"kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.048684 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.048739 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwx28\" (UniqueName: \"kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.048766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.049754 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.049825 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.049831 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.070976 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwx28\" (UniqueName: \"kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28\") pod \"dnsmasq-dns-6cd6d96557-dwqpz\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.084677 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7x8c\" (UniqueName: \"kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c\") pod \"dnsmasq-dns-7f9bd9cf5f-bdqmc\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.203547 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.204050 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.239070 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.240510 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.247867 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.250191 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.252174 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.252263 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.252296 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2wd\" (UniqueName: \"kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.353297 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.353392 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.353423 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2wd\" (UniqueName: \"kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.354616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.366935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.374240 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2wd\" (UniqueName: \"kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd\") pod \"dnsmasq-dns-5f55d4cc4c-hhgjk\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.613199 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.631074 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.657029 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.659601 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.683376 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"]
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.765157 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwjg\" (UniqueName: \"kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.765392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.765473 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.867224 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwjg\" (UniqueName: \"kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.867304 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.867333 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.868234 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.868841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:24 crc kubenswrapper[4954]: I1206 08:39:24.923694 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwjg\" (UniqueName: \"kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg\") pod \"dnsmasq-dns-68ddc8d76c-p4p8h\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.007088 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.014887 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.216630 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.233247 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"]
Dec 06 08:39:25 crc kubenswrapper[4954]: W1206 08:39:25.239919 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6289174_ae4e_4881_a916_e836d9ae7243.slice/crio-6e4ad6c7c30a8557302a8cca470e2682ad659684173618052bc1c65425a47d93 WatchSource:0}: Error finding container 6e4ad6c7c30a8557302a8cca470e2682ad659684173618052bc1c65425a47d93: Status 404 returned error can't find the container with id 6e4ad6c7c30a8557302a8cca470e2682ad659684173618052bc1c65425a47d93
Dec 06 08:39:25 crc kubenswrapper[4954]: W1206 08:39:25.243271 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb1b1824_c6c2_444a_9b15_fc102a1754e2.slice/crio-caa780e6d17c5a10b8617af9d7af721125a343140e4d2b985ef011890496efe3 WatchSource:0}: Error finding container caa780e6d17c5a10b8617af9d7af721125a343140e4d2b985ef011890496efe3: Status 404 returned error can't find the container with id caa780e6d17c5a10b8617af9d7af721125a343140e4d2b985ef011890496efe3
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.273281 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.390938 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.392543 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.404769 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.405954 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.406107 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.406241 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.406388 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.406539 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.406712 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mw7z5"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.434359 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.580504 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.580900 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.580934 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.580957 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581009 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581360 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581427 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581673 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.581698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wzmt\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.641878 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerStarted","Data":"849d5135433544a3e8c5e35b0ff5b4d3c9654a2805d49ccbb27963906cbc0be4"}
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.659198 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz" event={"ID":"eb1b1824-c6c2-444a-9b15-fc102a1754e2","Type":"ContainerStarted","Data":"caa780e6d17c5a10b8617af9d7af721125a343140e4d2b985ef011890496efe3"}
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.674412 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc" event={"ID":"0ffeb377-ffa2-4eed-afdc-933962462780","Type":"ContainerStarted","Data":"9aa448568b5517e7640b7e183342a5f495af98b0b3e6c5294858a5a8ee9d54ec"}
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.682947 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683014 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683040 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683079 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683206 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683236 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683296 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683356 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wzmt\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.683394 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.685453 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.686388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.688548 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" event={"ID":"d6289174-ae4e-4881-a916-e836d9ae7243","Type":"ContainerStarted","Data":"6e4ad6c7c30a8557302a8cca470e2682ad659684173618052bc1c65425a47d93"}
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.689027 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.690266 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.692901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.695454 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.698928 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.699209 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff64e79839bf38cdfa760e805152cd6507415719f8c1044e4cbe5ff4d2a39c02/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.706384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.706864 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.706973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.715429 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wzmt\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.752903 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.789214 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.790618 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.800708 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.800886 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.801005 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.801230 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.801291 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bfs4n"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.801350 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.802125 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.822215 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.886493 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.886602 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.886673 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.886945 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887128 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887322 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlb2\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.887486 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991647 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlb2\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991675 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991710 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991761 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991833 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991900 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991968 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.991991 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.992258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.992760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.992784 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.992896 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.995099 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.995721 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.995757 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c5c16d0a93ca6b4aec1033eaf25c57d8e8c8a5d1fe221d6f62016c4f1db2425/globalmount\"" pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.997718 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.998432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:25 crc kubenswrapper[4954]: I1206 08:39:25.999077 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.002886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.012369 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlb2\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.028178 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.036883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " pod="openstack/rabbitmq-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.127016 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.472330 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.474413 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.477061 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-djjnd"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.478532 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.478711 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.478734 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.491946 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.494978 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.548958 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 06 08:39:26 crc kubenswrapper[4954]: W1206 08:39:26.558044 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13f26340_bc95_4778_8d90_f1c8271c7974.slice/crio-ab8aacf99582b9cee37c50a22effb7331257b5c1ffa20cf61d51e6c76c7df78d WatchSource:0}: Error finding container ab8aacf99582b9cee37c50a22effb7331257b5c1ffa20cf61d51e6c76c7df78d: Status 404 returned error can't find the container with id ab8aacf99582b9cee37c50a22effb7331257b5c1ffa20cf61d51e6c76c7df78d
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.615818 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-094363bb-4891-4401-a126-68b7b976a62b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-094363bb-4891-4401-a126-68b7b976a62b\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.615874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.615974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.615998 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxns\" (UniqueName: \"kubernetes.io/projected/ef4d1830-bfa7-4aba-8718-a7e540e52222-kube-api-access-xnxns\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.616019 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-kolla-config\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.616041 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.616060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.616078 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-default\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.700464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerStarted","Data":"ab8aacf99582b9cee37c50a22effb7331257b5c1ffa20cf61d51e6c76c7df78d"}
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717688 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717751 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxns\" (UniqueName: \"kubernetes.io/projected/ef4d1830-bfa7-4aba-8718-a7e540e52222-kube-api-access-xnxns\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-kolla-config\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717805 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717831 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717854 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-default\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717900 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-094363bb-4891-4401-a126-68b7b976a62b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-094363bb-4891-4401-a126-68b7b976a62b\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.717930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.719434 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-kolla-config\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.719447 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.719845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0"
Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.722732 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName:
\"kubernetes.io/configmap/ef4d1830-bfa7-4aba-8718-a7e540e52222-config-data-default\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.722850 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.723198 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-094363bb-4891-4401-a126-68b7b976a62b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-094363bb-4891-4401-a126-68b7b976a62b\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0452b3748bc242b06d94a22e6795dac385dcf0020e85fbb2d86ee5dd9d993141/globalmount\"" pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.726069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.732810 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4d1830-bfa7-4aba-8718-a7e540e52222-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.741877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxns\" (UniqueName: \"kubernetes.io/projected/ef4d1830-bfa7-4aba-8718-a7e540e52222-kube-api-access-xnxns\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.761961 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.763054 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-094363bb-4891-4401-a126-68b7b976a62b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-094363bb-4891-4401-a126-68b7b976a62b\") pod \"openstack-galera-0\" (UID: \"ef4d1830-bfa7-4aba-8718-a7e540e52222\") " pod="openstack/openstack-galera-0" Dec 06 08:39:26 crc kubenswrapper[4954]: W1206 08:39:26.773862 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881b9e58_a271_4ecc_a59b_ff1c290add0a.slice/crio-3ff0f4548840f598dc69023fae296d48275c427b55b226257b4b8758a19f2568 WatchSource:0}: Error finding container 3ff0f4548840f598dc69023fae296d48275c427b55b226257b4b8758a19f2568: Status 404 returned error can't find the container with id 3ff0f4548840f598dc69023fae296d48275c427b55b226257b4b8758a19f2568 Dec 06 08:39:26 crc kubenswrapper[4954]: I1206 08:39:26.811660 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 08:39:27 crc kubenswrapper[4954]: I1206 08:39:27.348271 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 08:39:27 crc kubenswrapper[4954]: I1206 08:39:27.711577 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ef4d1830-bfa7-4aba-8718-a7e540e52222","Type":"ContainerStarted","Data":"9c5b2d32b8168cc02205f988cc7b888b8d8d2c5a98b497f3d2b4d4fe72148ce1"} Dec 06 08:39:27 crc kubenswrapper[4954]: I1206 08:39:27.713759 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerStarted","Data":"3ff0f4548840f598dc69023fae296d48275c427b55b226257b4b8758a19f2568"} Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.244874 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.246619 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.250052 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xf9g8" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.251022 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.251240 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.251621 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.278022 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.368767 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.368849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.368884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.368948 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br27p\" (UniqueName: \"kubernetes.io/projected/237adbf8-c3e0-427c-ac06-656c110de87b-kube-api-access-br27p\") pod \"openstack-cell1-galera-0\" 
(UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.368987 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.369067 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.369143 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.369177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.483441 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br27p\" (UniqueName: \"kubernetes.io/projected/237adbf8-c3e0-427c-ac06-656c110de87b-kube-api-access-br27p\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.483587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.483687 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.483871 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.483950 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.484069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.484143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.484187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.485053 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.486167 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.487109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.487507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/237adbf8-c3e0-427c-ac06-656c110de87b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.494392 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.494424 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.494461 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44ecb88e711efd9ebfda87b376e1b39d10814ce9f0fb57361f29380e7fe87ef0/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.513243 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/237adbf8-c3e0-427c-ac06-656c110de87b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.527455 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br27p\" (UniqueName: \"kubernetes.io/projected/237adbf8-c3e0-427c-ac06-656c110de87b-kube-api-access-br27p\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.577535 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5be64e66-365d-4293-8c3c-70a2c2f3bbec\") pod \"openstack-cell1-galera-0\" (UID: \"237adbf8-c3e0-427c-ac06-656c110de87b\") " pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.717119 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.718368 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.728159 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.728391 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-p6kcl" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.728577 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.740491 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.798714 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.799778 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-kolla-config\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.799821 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nwf8\" (UniqueName: \"kubernetes.io/projected/2f56f9e3-a30b-4342-981a-170433fd75ea-kube-api-access-2nwf8\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.799857 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.800019 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-config-data\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.867919 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.917005 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.917097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-kolla-config\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.917135 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nwf8\" (UniqueName: \"kubernetes.io/projected/2f56f9e3-a30b-4342-981a-170433fd75ea-kube-api-access-2nwf8\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.917158 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.917197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-config-data\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.920173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-kolla-config\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.927533 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f56f9e3-a30b-4342-981a-170433fd75ea-config-data\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.933954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.944871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f56f9e3-a30b-4342-981a-170433fd75ea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:28 crc kubenswrapper[4954]: I1206 08:39:28.945847 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nwf8\" (UniqueName: \"kubernetes.io/projected/2f56f9e3-a30b-4342-981a-170433fd75ea-kube-api-access-2nwf8\") pod \"memcached-0\" (UID: 
\"2f56f9e3-a30b-4342-981a-170433fd75ea\") " pod="openstack/memcached-0" Dec 06 08:39:29 crc kubenswrapper[4954]: I1206 08:39:29.071116 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 08:39:30 crc kubenswrapper[4954]: I1206 08:39:29.602674 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 08:39:30 crc kubenswrapper[4954]: W1206 08:39:29.677585 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237adbf8_c3e0_427c_ac06_656c110de87b.slice/crio-322cf37e2f059efff3539fb93f21b50e7420f89c712bd47a7c8ddea0698c71e9 WatchSource:0}: Error finding container 322cf37e2f059efff3539fb93f21b50e7420f89c712bd47a7c8ddea0698c71e9: Status 404 returned error can't find the container with id 322cf37e2f059efff3539fb93f21b50e7420f89c712bd47a7c8ddea0698c71e9 Dec 06 08:39:30 crc kubenswrapper[4954]: I1206 08:39:29.702360 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 08:39:30 crc kubenswrapper[4954]: W1206 08:39:29.725775 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f56f9e3_a30b_4342_981a_170433fd75ea.slice/crio-4dd53b739ba381a4c26b053193ed18e972e05b782174077380ccba49700caff2 WatchSource:0}: Error finding container 4dd53b739ba381a4c26b053193ed18e972e05b782174077380ccba49700caff2: Status 404 returned error can't find the container with id 4dd53b739ba381a4c26b053193ed18e972e05b782174077380ccba49700caff2 Dec 06 08:39:30 crc kubenswrapper[4954]: I1206 08:39:29.846361 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2f56f9e3-a30b-4342-981a-170433fd75ea","Type":"ContainerStarted","Data":"4dd53b739ba381a4c26b053193ed18e972e05b782174077380ccba49700caff2"} Dec 06 08:39:30 crc kubenswrapper[4954]: I1206 08:39:29.848205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"237adbf8-c3e0-427c-ac06-656c110de87b","Type":"ContainerStarted","Data":"322cf37e2f059efff3539fb93f21b50e7420f89c712bd47a7c8ddea0698c71e9"} Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.087987 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.088741 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.088878 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7x8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7f9bd9cf5f-bdqmc_openstack(0ffeb377-ffa2-4eed-afdc-933962462780): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.090142 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc" podUID="0ffeb377-ffa2-4eed-afdc-933962462780" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.113769 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.113839 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.113968 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6hc5h68h68h594h659hdbh679h65ch5f6hdch6h5b9h8fh55hfhf8h57fhc7h56ch687h669h559h678h5dhc7hf7h697h5d6h9ch669h54fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lt2wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f55d4cc4c-hhgjk_openstack(d6289174-ae4e-4881-a916-e836d9ae7243): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.115395 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.122173 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.122233 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.122354 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h57bh695h68dh54fhf5hc5h67h5d4hb6h696h685h54ch6h599h5c5h679h74h689h644h5c8h64ch555h5c6h5dh569h698h59fh66ch57bh5b9hb7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwx28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6cd6d96557-dwqpz_openstack(eb1b1824-c6c2-444a-9b15-fc102a1754e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.124200 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz" podUID="eb1b1824-c6c2-444a-9b15-fc102a1754e2" Dec 06 08:39:56 crc kubenswrapper[4954]: E1206 08:39:56.134087 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3923531bcda0b0811b2d5053f189beb\\\"\"" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" Dec 06 08:39:56 crc kubenswrapper[4954]: I1206 08:39:56.917588 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.089360 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config\") pod \"0ffeb377-ffa2-4eed-afdc-933962462780\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.089754 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc\") pod \"0ffeb377-ffa2-4eed-afdc-933962462780\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.089862 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7x8c\" (UniqueName: \"kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c\") pod \"0ffeb377-ffa2-4eed-afdc-933962462780\" (UID: \"0ffeb377-ffa2-4eed-afdc-933962462780\") " Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.090421 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config" (OuterVolumeSpecName: "config") pod "0ffeb377-ffa2-4eed-afdc-933962462780" (UID: "0ffeb377-ffa2-4eed-afdc-933962462780"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.090549 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ffeb377-ffa2-4eed-afdc-933962462780" (UID: "0ffeb377-ffa2-4eed-afdc-933962462780"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.095341 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c" (OuterVolumeSpecName: "kube-api-access-z7x8c") pod "0ffeb377-ffa2-4eed-afdc-933962462780" (UID: "0ffeb377-ffa2-4eed-afdc-933962462780"). InnerVolumeSpecName "kube-api-access-z7x8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.136689 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.136738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc" event={"ID":"0ffeb377-ffa2-4eed-afdc-933962462780","Type":"ContainerDied","Data":"9aa448568b5517e7640b7e183342a5f495af98b0b3e6c5294858a5a8ee9d54ec"} Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.193233 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7x8c\" (UniqueName: \"kubernetes.io/projected/0ffeb377-ffa2-4eed-afdc-933962462780-kube-api-access-z7x8c\") on node \"crc\" DevicePath \"\"" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.193274 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.193284 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ffeb377-ffa2-4eed-afdc-933962462780-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.203588 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"] Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.209866 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9bd9cf5f-bdqmc"] Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.457190 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.460927 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffeb377-ffa2-4eed-afdc-933962462780" path="/var/lib/kubelet/pods/0ffeb377-ffa2-4eed-afdc-933962462780/volumes" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.598649 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwx28\" (UniqueName: \"kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28\") pod \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.598748 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config\") pod \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\" (UID: \"eb1b1824-c6c2-444a-9b15-fc102a1754e2\") " Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.599507 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config" (OuterVolumeSpecName: "config") pod "eb1b1824-c6c2-444a-9b15-fc102a1754e2" (UID: "eb1b1824-c6c2-444a-9b15-fc102a1754e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.700578 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1b1824-c6c2-444a-9b15-fc102a1754e2-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.767671 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28" (OuterVolumeSpecName: "kube-api-access-mwx28") pod "eb1b1824-c6c2-444a-9b15-fc102a1754e2" (UID: "eb1b1824-c6c2-444a-9b15-fc102a1754e2"). InnerVolumeSpecName "kube-api-access-mwx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:39:57 crc kubenswrapper[4954]: I1206 08:39:57.802180 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwx28\" (UniqueName: \"kubernetes.io/projected/eb1b1824-c6c2-444a-9b15-fc102a1754e2-kube-api-access-mwx28\") on node \"crc\" DevicePath \"\"" Dec 06 08:39:58 crc kubenswrapper[4954]: I1206 08:39:58.144046 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz" event={"ID":"eb1b1824-c6c2-444a-9b15-fc102a1754e2","Type":"ContainerDied","Data":"caa780e6d17c5a10b8617af9d7af721125a343140e4d2b985ef011890496efe3"} Dec 06 08:39:58 crc kubenswrapper[4954]: I1206 08:39:58.144098 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6d96557-dwqpz" Dec 06 08:39:58 crc kubenswrapper[4954]: I1206 08:39:58.199783 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"] Dec 06 08:39:58 crc kubenswrapper[4954]: I1206 08:39:58.205645 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd6d96557-dwqpz"] Dec 06 08:39:59 crc kubenswrapper[4954]: I1206 08:39:59.454451 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1b1824-c6c2-444a-9b15-fc102a1754e2" path="/var/lib/kubelet/pods/eb1b1824-c6c2-444a-9b15-fc102a1754e2/volumes" Dec 06 08:40:03 crc kubenswrapper[4954]: I1206 08:40:03.190474 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerStarted","Data":"36c32bbc8bb092242847d53f4fdc5e9f955500d92276790e06c3a98b38139bfb"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.202152 4954 generic.go:334] "Generic (PLEG): container finished" podID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerID="36c32bbc8bb092242847d53f4fdc5e9f955500d92276790e06c3a98b38139bfb" exitCode=0 Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.202811 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerDied","Data":"36c32bbc8bb092242847d53f4fdc5e9f955500d92276790e06c3a98b38139bfb"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.205231 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ef4d1830-bfa7-4aba-8718-a7e540e52222","Type":"ContainerStarted","Data":"1af10588b1dd6cbf3de2bc0e91ce1e24beeae032665c5d2a77a600374f025f0d"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.207407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerStarted","Data":"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.210951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2f56f9e3-a30b-4342-981a-170433fd75ea","Type":"ContainerStarted","Data":"dde2fcb6263ea1cb1a5163231c7f5f0852c2a711c039f5abd719ec8c01fc445c"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.211141 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.222147 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"237adbf8-c3e0-427c-ac06-656c110de87b","Type":"ContainerStarted","Data":"a1b074e3bfaa409bef8ce2ce3eac76b653d78af54aaee837f7247b20c55f7a32"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.231706 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerStarted","Data":"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000"} Dec 06 08:40:04 crc kubenswrapper[4954]: I1206 08:40:04.357728 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.171965874 podStartE2EDuration="36.357648272s" podCreationTimestamp="2025-12-06 08:39:28 +0000 UTC" firstStartedPulling="2025-12-06 08:39:29.730075657 +0000 UTC m=+6144.543435046" lastFinishedPulling="2025-12-06 08:39:56.915758055 +0000 UTC m=+6171.729117444" observedRunningTime="2025-12-06 08:40:04.328077874 +0000 UTC m=+6179.141437273" watchObservedRunningTime="2025-12-06 08:40:04.357648272 +0000 UTC m=+6179.171007661" Dec 06 08:40:05 crc kubenswrapper[4954]: I1206 08:40:05.247156 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerStarted","Data":"db1753fa11e08cf4056cae860be8749dd07f8ad087b6a29282043c618d5ff9e3"} Dec 06 08:40:06 crc kubenswrapper[4954]: I1206 08:40:06.255011 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" Dec 06 08:40:07 crc kubenswrapper[4954]: I1206 08:40:07.262749 4954 generic.go:334] "Generic (PLEG): container finished" podID="237adbf8-c3e0-427c-ac06-656c110de87b" containerID="a1b074e3bfaa409bef8ce2ce3eac76b653d78af54aaee837f7247b20c55f7a32" exitCode=0 Dec 06 08:40:07 crc kubenswrapper[4954]: I1206 08:40:07.262832 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"237adbf8-c3e0-427c-ac06-656c110de87b","Type":"ContainerDied","Data":"a1b074e3bfaa409bef8ce2ce3eac76b653d78af54aaee837f7247b20c55f7a32"} Dec 06 08:40:07 crc kubenswrapper[4954]: I1206 08:40:07.265971 4954 generic.go:334] "Generic (PLEG): container finished" podID="ef4d1830-bfa7-4aba-8718-a7e540e52222" containerID="1af10588b1dd6cbf3de2bc0e91ce1e24beeae032665c5d2a77a600374f025f0d" exitCode=0 Dec 06 08:40:07 crc kubenswrapper[4954]: I1206 08:40:07.266517 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ef4d1830-bfa7-4aba-8718-a7e540e52222","Type":"ContainerDied","Data":"1af10588b1dd6cbf3de2bc0e91ce1e24beeae032665c5d2a77a600374f025f0d"} Dec 06 08:40:07 crc kubenswrapper[4954]: I1206 08:40:07.296258 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" podStartSLOduration=11.687353423 podStartE2EDuration="43.296235353s" podCreationTimestamp="2025-12-06 08:39:24 +0000 UTC" firstStartedPulling="2025-12-06 08:39:25.30632138 +0000 UTC m=+6140.119680769" lastFinishedPulling="2025-12-06 08:39:56.91520329 +0000 UTC m=+6171.728562699" observedRunningTime="2025-12-06 08:40:05.279843034 +0000 UTC m=+6180.093202453" watchObservedRunningTime="2025-12-06 08:40:07.296235353 +0000 UTC m=+6182.109594742" Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.296526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ef4d1830-bfa7-4aba-8718-a7e540e52222","Type":"ContainerStarted","Data":"7c81ed12a8e143b0829b59ac348691c96ffa148a01f7ca5a18b0f2941adb9c77"} Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.300266 4954 generic.go:334] "Generic (PLEG): container finished" podID="d6289174-ae4e-4881-a916-e836d9ae7243" containerID="36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f" exitCode=0 Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.300353 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" event={"ID":"d6289174-ae4e-4881-a916-e836d9ae7243","Type":"ContainerDied","Data":"36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f"} Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.308670 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"237adbf8-c3e0-427c-ac06-656c110de87b","Type":"ContainerStarted","Data":"7e6f51684909847faae3ab5dc1b131a955ad4f117f59b195a61f873fbf80e97f"} Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.369194 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=13.819950143 podStartE2EDuration="43.369151301s" podCreationTimestamp="2025-12-06 08:39:25 +0000 UTC" firstStartedPulling="2025-12-06 08:39:27.361980645 +0000 UTC m=+6142.175340034" lastFinishedPulling="2025-12-06 08:39:56.911181803 +0000 UTC m=+6171.724541192" observedRunningTime="2025-12-06 08:40:08.332665289 +0000 UTC m=+6183.146024678" watchObservedRunningTime="2025-12-06 08:40:08.369151301 +0000 UTC m=+6183.182510690" Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.386264 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=14.094346469 podStartE2EDuration="41.386235787s" podCreationTimestamp="2025-12-06 08:39:27 +0000 UTC" firstStartedPulling="2025-12-06 08:39:29.69376995 +0000 UTC m=+6144.507129339" lastFinishedPulling="2025-12-06 08:39:56.985659268 +0000 UTC m=+6171.799018657" observedRunningTime="2025-12-06 08:40:08.377789922 +0000 UTC m=+6183.191149311" watchObservedRunningTime="2025-12-06 08:40:08.386235787 +0000 UTC m=+6183.199595196" Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.869647 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 08:40:08 crc kubenswrapper[4954]: I1206 08:40:08.869745 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 08:40:09 crc kubenswrapper[4954]: I1206 08:40:09.073729 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 08:40:09 crc kubenswrapper[4954]: I1206 
08:40:09.320791 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" event={"ID":"d6289174-ae4e-4881-a916-e836d9ae7243","Type":"ContainerStarted","Data":"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077"} Dec 06 08:40:09 crc kubenswrapper[4954]: I1206 08:40:09.321449 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" Dec 06 08:40:09 crc kubenswrapper[4954]: I1206 08:40:09.345977 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" podStartSLOduration=-9223371991.508821 podStartE2EDuration="45.345954199s" podCreationTimestamp="2025-12-06 08:39:24 +0000 UTC" firstStartedPulling="2025-12-06 08:39:25.243448964 +0000 UTC m=+6140.056808353" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:40:09.338702246 +0000 UTC m=+6184.152061635" watchObservedRunningTime="2025-12-06 08:40:09.345954199 +0000 UTC m=+6184.159313588" Dec 06 08:40:10 crc kubenswrapper[4954]: I1206 08:40:10.009795 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" Dec 06 08:40:10 crc kubenswrapper[4954]: I1206 08:40:10.082629 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"] Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.346934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="dnsmasq-dns" containerID="cri-o://56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077" gracePeriod=10 Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.755149 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.837210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2wd\" (UniqueName: \"kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd\") pod \"d6289174-ae4e-4881-a916-e836d9ae7243\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.837283 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config\") pod \"d6289174-ae4e-4881-a916-e836d9ae7243\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.837372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc\") pod \"d6289174-ae4e-4881-a916-e836d9ae7243\" (UID: \"d6289174-ae4e-4881-a916-e836d9ae7243\") " Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.844180 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd" (OuterVolumeSpecName: "kube-api-access-lt2wd") pod "d6289174-ae4e-4881-a916-e836d9ae7243" (UID: "d6289174-ae4e-4881-a916-e836d9ae7243"). InnerVolumeSpecName "kube-api-access-lt2wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.877404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config" (OuterVolumeSpecName: "config") pod "d6289174-ae4e-4881-a916-e836d9ae7243" (UID: "d6289174-ae4e-4881-a916-e836d9ae7243"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.878077 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6289174-ae4e-4881-a916-e836d9ae7243" (UID: "d6289174-ae4e-4881-a916-e836d9ae7243"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.939361 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2wd\" (UniqueName: \"kubernetes.io/projected/d6289174-ae4e-4881-a916-e836d9ae7243-kube-api-access-lt2wd\") on node \"crc\" DevicePath \"\"" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.939410 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:40:11 crc kubenswrapper[4954]: I1206 08:40:11.939423 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6289174-ae4e-4881-a916-e836d9ae7243-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.359612 4954 generic.go:334] "Generic (PLEG): container finished" podID="d6289174-ae4e-4881-a916-e836d9ae7243" containerID="56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077" exitCode=0 Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.359820 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.360723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" event={"ID":"d6289174-ae4e-4881-a916-e836d9ae7243","Type":"ContainerDied","Data":"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077"} Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.360883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f55d4cc4c-hhgjk" event={"ID":"d6289174-ae4e-4881-a916-e836d9ae7243","Type":"ContainerDied","Data":"6e4ad6c7c30a8557302a8cca470e2682ad659684173618052bc1c65425a47d93"} Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.360930 4954 scope.go:117] "RemoveContainer" containerID="56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.386229 4954 scope.go:117] "RemoveContainer" containerID="36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.405897 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"] Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.408179 4954 scope.go:117] "RemoveContainer" containerID="56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.416134 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f55d4cc4c-hhgjk"] Dec 06 08:40:12 crc kubenswrapper[4954]: E1206 08:40:12.416234 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077\": container with ID starting with 56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077 not found: ID does not exist" containerID="56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.416279 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077"} err="failed to get container status \"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077\": rpc error: code = NotFound desc = could not find container \"56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077\": container with ID starting with 56f6cc8c2771618276f42967128a2a0c5e88b3ee5675582cd44def97d1131077 not found: ID does not exist" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.416339 4954 scope.go:117] "RemoveContainer" containerID="36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f" Dec 06 08:40:12 crc kubenswrapper[4954]: E1206 08:40:12.418731 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f\": container with ID starting with 36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f not found: ID does not exist" containerID="36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.418802 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f"} err="failed to get container status 
\"36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f\": rpc error: code = NotFound desc = could not find container \"36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f\": container with ID starting with 36f4e58de946fee0837f6d1c24588323833a140d54f854140b396cee9b23fc5f not found: ID does not exist" Dec 06 08:40:12 crc kubenswrapper[4954]: I1206 08:40:12.980556 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 08:40:13 crc kubenswrapper[4954]: I1206 08:40:13.060428 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 06 08:40:13 crc kubenswrapper[4954]: I1206 08:40:13.457781 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" path="/var/lib/kubelet/pods/d6289174-ae4e-4881-a916-e836d9ae7243/volumes" Dec 06 08:40:16 crc kubenswrapper[4954]: I1206 08:40:16.812115 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 08:40:16 crc kubenswrapper[4954]: I1206 08:40:16.812865 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 08:40:16 crc kubenswrapper[4954]: I1206 08:40:16.912048 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 06 08:40:17 crc kubenswrapper[4954]: I1206 08:40:17.491706 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 08:40:38 crc kubenswrapper[4954]: I1206 08:40:38.613763 4954 generic.go:334] "Generic (PLEG): container finished" podID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerID="64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7" exitCode=0 Dec 06 08:40:38 crc kubenswrapper[4954]: I1206 08:40:38.613871 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerDied","Data":"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7"} Dec 06 08:40:38 crc kubenswrapper[4954]: I1206 08:40:38.620254 4954 generic.go:334] "Generic (PLEG): container finished" podID="13f26340-bc95-4778-8d90-f1c8271c7974" containerID="f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000" exitCode=0 Dec 06 08:40:38 crc kubenswrapper[4954]: I1206 08:40:38.620315 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerDied","Data":"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000"} Dec 06 08:40:39 crc kubenswrapper[4954]: I1206 08:40:39.628378 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerStarted","Data":"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343"} Dec 06 08:40:39 crc kubenswrapper[4954]: I1206 08:40:39.629547 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 08:40:39 crc kubenswrapper[4954]: I1206 08:40:39.630523 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerStarted","Data":"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d"} Dec 06 08:40:39 crc 
kubenswrapper[4954]: I1206 08:40:39.630811 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:40:39 crc kubenswrapper[4954]: I1206 08:40:39.659791 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.539854155 podStartE2EDuration="1m15.659769105s" podCreationTimestamp="2025-12-06 08:39:24 +0000 UTC" firstStartedPulling="2025-12-06 08:39:26.788963641 +0000 UTC m=+6141.602323030" lastFinishedPulling="2025-12-06 08:39:56.908878591 +0000 UTC m=+6171.722237980" observedRunningTime="2025-12-06 08:40:39.654790792 +0000 UTC m=+6214.468150201" watchObservedRunningTime="2025-12-06 08:40:39.659769105 +0000 UTC m=+6214.473128494" Dec 06 08:40:39 crc kubenswrapper[4954]: I1206 08:40:39.690689 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.368330062 podStartE2EDuration="1m15.690649588s" podCreationTimestamp="2025-12-06 08:39:24 +0000 UTC" firstStartedPulling="2025-12-06 08:39:26.588830686 +0000 UTC m=+6141.402190065" lastFinishedPulling="2025-12-06 08:39:56.911150182 +0000 UTC m=+6171.724509591" observedRunningTime="2025-12-06 08:40:39.688228563 +0000 UTC m=+6214.501587952" watchObservedRunningTime="2025-12-06 08:40:39.690649588 +0000 UTC m=+6214.504008987" Dec 06 08:40:56 crc kubenswrapper[4954]: I1206 08:40:56.030857 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:40:56 crc kubenswrapper[4954]: I1206 08:40:56.130905 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.437220 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:41:03 crc kubenswrapper[4954]: E1206 08:41:03.439130 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="dnsmasq-dns" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.439215 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="dnsmasq-dns" Dec 06 08:41:03 crc kubenswrapper[4954]: E1206 08:41:03.439305 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="init" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.439368 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="init" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.439583 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6289174-ae4e-4881-a916-e836d9ae7243" containerName="dnsmasq-dns" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.440409 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.493898 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.564418 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd5zb\" (UniqueName: \"kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.564484 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.564510 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.666159 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd5zb\" (UniqueName: \"kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.666214 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.666251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.667269 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.667365 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.686509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd5zb\" (UniqueName: 
\"kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb\") pod \"dnsmasq-dns-6f7f6bbcbf-8lknl\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:03 crc kubenswrapper[4954]: I1206 08:41:03.772926 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.122545 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.276956 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.838524 4954 generic.go:334] "Generic (PLEG): container finished" podID="2e847fab-859e-418c-96d2-9780288fd5f7" containerID="4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce" exitCode=0 Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.838614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" event={"ID":"2e847fab-859e-418c-96d2-9780288fd5f7","Type":"ContainerDied","Data":"4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce"} Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.838930 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" event={"ID":"2e847fab-859e-418c-96d2-9780288fd5f7","Type":"ContainerStarted","Data":"8482361d5278f2d6989bc1b904462faa0e91ee52bb971921f06f842a6696bd41"} Dec 06 08:41:04 crc kubenswrapper[4954]: I1206 08:41:04.898162 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:05 crc kubenswrapper[4954]: I1206 08:41:05.846956 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" event={"ID":"2e847fab-859e-418c-96d2-9780288fd5f7","Type":"ContainerStarted","Data":"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb"} Dec 06 08:41:05 crc kubenswrapper[4954]: I1206 08:41:05.848296 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:05 crc kubenswrapper[4954]: I1206 08:41:05.868536 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" podStartSLOduration=2.868516651 podStartE2EDuration="2.868516651s" podCreationTimestamp="2025-12-06 08:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:41:05.863654541 +0000 UTC m=+6240.677013940" watchObservedRunningTime="2025-12-06 08:41:05.868516651 +0000 UTC m=+6240.681876040" Dec 06 08:41:08 crc kubenswrapper[4954]: I1206 08:41:08.275608 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="rabbitmq" containerID="cri-o://4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343" gracePeriod=604796 Dec 06 08:41:08 crc kubenswrapper[4954]: I1206 08:41:08.792254 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="rabbitmq" containerID="cri-o://ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d" gracePeriod=604797 Dec 06 08:41:10 crc 
kubenswrapper[4954]: I1206 08:41:10.101408 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:41:10 crc kubenswrapper[4954]: I1206 08:41:10.101696 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:41:13 crc kubenswrapper[4954]: I1206 08:41:13.775300 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:41:13 crc kubenswrapper[4954]: I1206 08:41:13.841617 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"] Dec 06 08:41:13 crc kubenswrapper[4954]: I1206 08:41:13.842591 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="dnsmasq-dns" containerID="cri-o://db1753fa11e08cf4056cae860be8749dd07f8ad087b6a29282043c618d5ff9e3" gracePeriod=10 Dec 06 08:41:15 crc kubenswrapper[4954]: I1206 08:41:15.033661 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.253:5353: connect: connection refused" Dec 06 08:41:16 crc kubenswrapper[4954]: I1206 08:41:16.029623 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.254:5671: connect: connection refused" Dec 06 08:41:16 crc kubenswrapper[4954]: I1206 08:41:16.128994 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.255:5671: connect: connection refused" Dec 06 08:41:16 crc kubenswrapper[4954]: I1206 08:41:16.932326 4954 generic.go:334] "Generic (PLEG): container finished" podID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerID="db1753fa11e08cf4056cae860be8749dd07f8ad087b6a29282043c618d5ff9e3" exitCode=0 Dec 06 08:41:16 crc kubenswrapper[4954]: I1206 08:41:16.932439 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerDied","Data":"db1753fa11e08cf4056cae860be8749dd07f8ad087b6a29282043c618d5ff9e3"} Dec 06 08:41:17 crc kubenswrapper[4954]: E1206 08:41:17.321641 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881b9e58_a271_4ecc_a59b_ff1c290add0a.slice/crio-conmon-4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343.scope\": RecentStats: unable to find data in memory cache]" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.329108 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.508358 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc\") pod \"f264bd7e-c010-4d5a-af52-d39a6adca97a\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.508629 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwjg\" (UniqueName: \"kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg\") pod \"f264bd7e-c010-4d5a-af52-d39a6adca97a\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.508680 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config\") pod \"f264bd7e-c010-4d5a-af52-d39a6adca97a\" (UID: \"f264bd7e-c010-4d5a-af52-d39a6adca97a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.515618 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.516970 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg" (OuterVolumeSpecName: "kube-api-access-xxwjg") pod "f264bd7e-c010-4d5a-af52-d39a6adca97a" (UID: "f264bd7e-c010-4d5a-af52-d39a6adca97a"). InnerVolumeSpecName "kube-api-access-xxwjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.544679 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f264bd7e-c010-4d5a-af52-d39a6adca97a" (UID: "f264bd7e-c010-4d5a-af52-d39a6adca97a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.552947 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config" (OuterVolumeSpecName: "config") pod "f264bd7e-c010-4d5a-af52-d39a6adca97a" (UID: "f264bd7e-c010-4d5a-af52-d39a6adca97a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.565632 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.610488 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwjg\" (UniqueName: \"kubernetes.io/projected/f264bd7e-c010-4d5a-af52-d39a6adca97a-kube-api-access-xxwjg\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.610533 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.610546 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f264bd7e-c010-4d5a-af52-d39a6adca97a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712078 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712147 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712182 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712236 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712425 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712476 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712534 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712586 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712613 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wzmt\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlb2\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712673 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712712 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712738 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712855 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712906 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712905 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.712937 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713028 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls\") pod \"881b9e58-a271-4ecc-a59b-ff1c290add0a\" (UID: \"881b9e58-a271-4ecc-a59b-ff1c290add0a\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713093 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713132 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd\") pod \"13f26340-bc95-4778-8d90-f1c8271c7974\" (UID: \"13f26340-bc95-4778-8d90-f1c8271c7974\") " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713446 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.713473 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.714341 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.715422 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.716312 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.719814 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.721452 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.723457 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2" (OuterVolumeSpecName: "kube-api-access-4vlb2") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "kube-api-access-4vlb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.723678 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt" (OuterVolumeSpecName: "kube-api-access-2wzmt") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "kube-api-access-2wzmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.726250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info" (OuterVolumeSpecName: "pod-info") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.726460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.729165 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.730173 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info" (OuterVolumeSpecName: "pod-info") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.738983 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.742016 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0" (OuterVolumeSpecName: "persistence") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.743470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6" (OuterVolumeSpecName: "persistence") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.747665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data" (OuterVolumeSpecName: "config-data") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.751389 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data" (OuterVolumeSpecName: "config-data") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.772495 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf" (OuterVolumeSpecName: "server-conf") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.776625 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf" (OuterVolumeSpecName: "server-conf") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.804448 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "881b9e58-a271-4ecc-a59b-ff1c290add0a" (UID: "881b9e58-a271-4ecc-a59b-ff1c290add0a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814669 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814706 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/881b9e58-a271-4ecc-a59b-ff1c290add0a-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814717 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814728 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814739 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814861 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") on node \"crc\" " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814880 4954 reconciler_common.go:293] "Volume detached for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13f26340-bc95-4778-8d90-f1c8271c7974-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814895 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814909 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814921 4954 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13f26340-bc95-4778-8d90-f1c8271c7974-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814932 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.814985 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") on node \"crc\" " Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815001 4954 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/881b9e58-a271-4ecc-a59b-ff1c290add0a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815014 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815024 4954 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815077 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wzmt\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-kube-api-access-2wzmt\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815090 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlb2\" (UniqueName: \"kubernetes.io/projected/881b9e58-a271-4ecc-a59b-ff1c290add0a-kube-api-access-4vlb2\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815101 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/881b9e58-a271-4ecc-a59b-ff1c290add0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815110 4954 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13f26340-bc95-4778-8d90-f1c8271c7974-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.815121 4954 reconciler_common.go:293] "Volume 
detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.822201 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "13f26340-bc95-4778-8d90-f1c8271c7974" (UID: "13f26340-bc95-4778-8d90-f1c8271c7974"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.836913 4954 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.837248 4954 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6") on node "crc" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.838374 4954 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.838524 4954 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0") on node "crc" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.916368 4954 reconciler_common.go:293] "Volume detached for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.916584 4954 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13f26340-bc95-4778-8d90-f1c8271c7974-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.916648 4954 reconciler_common.go:293] "Volume detached for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") on node \"crc\" DevicePath \"\"" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.940353 4954 generic.go:334] "Generic (PLEG): container finished" podID="13f26340-bc95-4778-8d90-f1c8271c7974" containerID="ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d" exitCode=0 Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.940409 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.940416 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerDied","Data":"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d"} Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.940802 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13f26340-bc95-4778-8d90-f1c8271c7974","Type":"ContainerDied","Data":"ab8aacf99582b9cee37c50a22effb7331257b5c1ffa20cf61d51e6c76c7df78d"} Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.940821 4954 scope.go:117] "RemoveContainer" containerID="ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.944166 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" event={"ID":"f264bd7e-c010-4d5a-af52-d39a6adca97a","Type":"ContainerDied","Data":"849d5135433544a3e8c5e35b0ff5b4d3c9654a2805d49ccbb27963906cbc0be4"} Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.944935 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68ddc8d76c-p4p8h" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.947090 4954 generic.go:334] "Generic (PLEG): container finished" podID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerID="4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343" exitCode=0 Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.947132 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerDied","Data":"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343"} Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.947152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"881b9e58-a271-4ecc-a59b-ff1c290add0a","Type":"ContainerDied","Data":"3ff0f4548840f598dc69023fae296d48275c427b55b226257b4b8758a19f2568"} Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.947176 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.960758 4954 scope.go:117] "RemoveContainer" containerID="f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000" Dec 06 08:41:17 crc kubenswrapper[4954]: I1206 08:41:17.987873 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.000043 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.006917 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.008774 4954 scope.go:117] "RemoveContainer" containerID="ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.013780 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d\": container with ID starting with ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d not found: ID does not exist" containerID="ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.013847 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d"} err="failed to get container status \"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d\": rpc error: code = NotFound desc = could not find container \"ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d\": container with ID starting with ce45072658eb533b80f114cd5a0828c046ab75d2cbe9865d4343afe0f0b8d47d not found: ID does not exist" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.013886 4954 scope.go:117] "RemoveContainer" containerID="f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.014378 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000\": container with ID starting with f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000 not found: ID does not exist" containerID="f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.014408 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000"} err="failed to get container status \"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000\": rpc error: code = NotFound desc = could not find container \"f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000\": container with ID starting with f00195ff2ec109ac2efc7bb48ff1f06e36e4c6e043451c25f4c1b30203ccf000 not found: ID does not exist" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.014427 4954 scope.go:117] "RemoveContainer" containerID="db1753fa11e08cf4056cae860be8749dd07f8ad087b6a29282043c618d5ff9e3" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.017382 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:18 crc 
kubenswrapper[4954]: I1206 08:41:18.029420 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.029895 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="setup-container" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.029911 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="setup-container" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.029923 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="dnsmasq-dns" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.029930 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="dnsmasq-dns" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.029966 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.029975 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.029987 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.029996 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.030008 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="setup-container" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.030018 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="setup-container" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.030029 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="init" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.030037 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="init" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.030203 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" containerName="dnsmasq-dns" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.030222 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.030234 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" containerName="rabbitmq" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.031142 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.034693 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.034953 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mw7z5" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.035149 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.035373 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.035616 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.035774 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.035942 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.037596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.038854 4954 scope.go:117] "RemoveContainer" containerID="36c32bbc8bb092242847d53f4fdc5e9f955500d92276790e06c3a98b38139bfb" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.043834 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68ddc8d76c-p4p8h"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.049341 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.067609 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.067777 4954 scope.go:117] "RemoveContainer" containerID="4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.069020 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.072033 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.072434 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.072465 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bfs4n" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.072764 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.073099 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.073276 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.073296 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.077863 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.103291 4954 scope.go:117] "RemoveContainer" containerID="64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.190511 4954 scope.go:117] "RemoveContainer" containerID="4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.191026 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343\": container with ID starting with 4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343 not found: ID does not exist" containerID="4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.191092 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343"} err="failed to get container status \"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343\": rpc error: code = NotFound desc = could not find container \"4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343\": container with ID starting with 4baead65ce2ccba6d3df6dd439f17b411416cd8f50cca441bf4f04af8cf52343 not found: ID does not exist" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.191137 4954 scope.go:117] "RemoveContainer" containerID="64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7" Dec 06 08:41:18 crc kubenswrapper[4954]: E1206 08:41:18.191424 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7\": container with ID starting with 64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7 not found: ID does not exist" containerID="64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.191457 4954 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7"} err="failed to get container status \"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7\": rpc error: code = NotFound desc = could not find container \"64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7\": container with ID starting with 64922413eb189670c0fffa2cf9cfa5dab81a6109ce4f69edea7c669aadd50af7 not found: ID does not exist" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.221792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.221858 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/813293b6-616e-4bc6-a118-c73410b6cc88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.221892 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.221912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222008 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222061 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqtbb\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-kube-api-access-cqtbb\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222212 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222240 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-config-data\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222417 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222531 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222622 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/813293b6-616e-4bc6-a118-c73410b6cc88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222699 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frqg\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-kube-api-access-5frqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222718 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222759 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.222787 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.323896 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.323967 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324003 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqtbb\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-kube-api-access-cqtbb\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324058 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324076 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324125 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-config-data\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324154 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324193 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324225 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/813293b6-616e-4bc6-a118-c73410b6cc88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324243 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324261 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frqg\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-kube-api-access-5frqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324291 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324316 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324336 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324356 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324379 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/813293b6-616e-4bc6-a118-c73410b6cc88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324404 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324421 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.324928 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.325244 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.325261 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.325633 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.325760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-server-conf\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.326396 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-config-data\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.326429 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.326781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.326887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.327394 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/813293b6-616e-4bc6-a118-c73410b6cc88-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc 
kubenswrapper[4954]: I1206 08:41:18.330296 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.330339 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c5c16d0a93ca6b4aec1033eaf25c57d8e8c8a5d1fe221d6f62016c4f1db2425/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.330640 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.330677 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff64e79839bf38cdfa760e805152cd6507415719f8c1044e4cbe5ff4d2a39c02/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.330970 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.331019 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.331226 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.331255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.331440 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/813293b6-616e-4bc6-a118-c73410b6cc88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.331937 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-pod-info\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.333109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/813293b6-616e-4bc6-a118-c73410b6cc88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.338049 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.341229 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqtbb\" (UniqueName: \"kubernetes.io/projected/350fbf38-ed6d-43f8-bb3f-54afba2c0a08-kube-api-access-cqtbb\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.341759 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frqg\" (UniqueName: \"kubernetes.io/projected/813293b6-616e-4bc6-a118-c73410b6cc88-kube-api-access-5frqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.357533 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d34623bd-c1ac-4321-8e2c-3215770ff5e6\") pod \"rabbitmq-cell1-server-0\" (UID: \"813293b6-616e-4bc6-a118-c73410b6cc88\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.360905 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.363082 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25c9f8e3-5bcd-4374-930c-fbd5943ad4d0\") pod \"rabbitmq-server-0\" (UID: \"350fbf38-ed6d-43f8-bb3f-54afba2c0a08\") " pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.403977 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.589074 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.659596 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 08:41:18 crc kubenswrapper[4954]: W1206 08:41:18.690504 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod350fbf38_ed6d_43f8_bb3f_54afba2c0a08.slice/crio-4effb7a4b47ca9e23e6e9aaacabd1c64072ac58a69ac16a9a9e135444e163d0a WatchSource:0}: Error finding container 4effb7a4b47ca9e23e6e9aaacabd1c64072ac58a69ac16a9a9e135444e163d0a: Status 404 returned error can't find the container with id 4effb7a4b47ca9e23e6e9aaacabd1c64072ac58a69ac16a9a9e135444e163d0a Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.956871 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"813293b6-616e-4bc6-a118-c73410b6cc88","Type":"ContainerStarted","Data":"eacd227156b929f8cd09f06a0e475a055d1de4386e16cf553d10723892b5bf76"} Dec 06 08:41:18 crc kubenswrapper[4954]: I1206 08:41:18.958055 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"350fbf38-ed6d-43f8-bb3f-54afba2c0a08","Type":"ContainerStarted","Data":"4effb7a4b47ca9e23e6e9aaacabd1c64072ac58a69ac16a9a9e135444e163d0a"} Dec 06 08:41:19 crc kubenswrapper[4954]: I1206 08:41:19.451820 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f26340-bc95-4778-8d90-f1c8271c7974" path="/var/lib/kubelet/pods/13f26340-bc95-4778-8d90-f1c8271c7974/volumes" Dec 06 08:41:19 crc kubenswrapper[4954]: I1206 08:41:19.452891 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881b9e58-a271-4ecc-a59b-ff1c290add0a" path="/var/lib/kubelet/pods/881b9e58-a271-4ecc-a59b-ff1c290add0a/volumes" Dec 06 08:41:19 crc kubenswrapper[4954]: I1206 08:41:19.453931 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f264bd7e-c010-4d5a-af52-d39a6adca97a" path="/var/lib/kubelet/pods/f264bd7e-c010-4d5a-af52-d39a6adca97a/volumes" Dec 06 08:41:19 crc kubenswrapper[4954]: I1206 08:41:19.971126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"813293b6-616e-4bc6-a118-c73410b6cc88","Type":"ContainerStarted","Data":"f78194253d9aac99c645bb5bbf0307142955e1542e8516fc1a2f9dd6e5e52649"} Dec 06 08:41:20 crc kubenswrapper[4954]: I1206 08:41:20.980632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"350fbf38-ed6d-43f8-bb3f-54afba2c0a08","Type":"ContainerStarted","Data":"a022d3d0bc45aa17e3c472c5f05a58fadb44c5cffbc78176a3061d5c4b884c95"} Dec 06 08:41:40 crc kubenswrapper[4954]: I1206 08:41:40.101062 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:41:40 crc kubenswrapper[4954]: I1206 08:41:40.101642 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:41:53 crc kubenswrapper[4954]: I1206 08:41:53.255630 4954 generic.go:334] "Generic (PLEG): container finished" podID="813293b6-616e-4bc6-a118-c73410b6cc88" containerID="f78194253d9aac99c645bb5bbf0307142955e1542e8516fc1a2f9dd6e5e52649" exitCode=0 Dec 06 08:41:53 crc kubenswrapper[4954]: I1206 08:41:53.255743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"813293b6-616e-4bc6-a118-c73410b6cc88","Type":"ContainerDied","Data":"f78194253d9aac99c645bb5bbf0307142955e1542e8516fc1a2f9dd6e5e52649"} Dec 06 08:41:53 crc kubenswrapper[4954]: I1206 08:41:53.259199 4954 generic.go:334] "Generic (PLEG): container finished" podID="350fbf38-ed6d-43f8-bb3f-54afba2c0a08" containerID="a022d3d0bc45aa17e3c472c5f05a58fadb44c5cffbc78176a3061d5c4b884c95" exitCode=0 Dec 06 08:41:53 crc kubenswrapper[4954]: I1206 08:41:53.259244 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"350fbf38-ed6d-43f8-bb3f-54afba2c0a08","Type":"ContainerDied","Data":"a022d3d0bc45aa17e3c472c5f05a58fadb44c5cffbc78176a3061d5c4b884c95"} Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.266496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"813293b6-616e-4bc6-a118-c73410b6cc88","Type":"ContainerStarted","Data":"f7925a7c6b44ff3d6e5e1667788e27a06a0efa8455931c125c6859933978b155"} Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.266876 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.268446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"350fbf38-ed6d-43f8-bb3f-54afba2c0a08","Type":"ContainerStarted","Data":"1ccce1445b5e97b63457e07e4af6783ffae81445f901ef2ca4051b4bfeb3348a"} Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.268867 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.292181 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.292157702 podStartE2EDuration="37.292157702s" podCreationTimestamp="2025-12-06 08:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:41:54.29133641 +0000 UTC m=+6289.104695819" watchObservedRunningTime="2025-12-06 08:41:54.292157702 +0000 UTC m=+6289.105517091" Dec 06 08:41:54 crc kubenswrapper[4954]: I1206 08:41:54.318656 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.318640208 podStartE2EDuration="36.318640208s" podCreationTimestamp="2025-12-06 08:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:41:54.316333307 +0000 UTC m=+6289.129692696" watchObservedRunningTime="2025-12-06 08:41:54.318640208 +0000 UTC m=+6289.131999597" Dec 06 08:42:08 crc kubenswrapper[4954]: I1206 08:42:08.364115 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 08:42:08 crc kubenswrapper[4954]: I1206 08:42:08.407792 4954 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.100920 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.100984 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.101031 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.101684 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.101737 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" gracePeriod=600 Dec 06 08:42:10 crc kubenswrapper[4954]: E1206 08:42:10.261137 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.400690 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" exitCode=0 Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.400781 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448"} Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.400856 4954 scope.go:117] "RemoveContainer" containerID="0bd6f8a5555a0f99770e3b85c32a88e35dc9167385ddb1f2f1ba17c4331137be" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.402026 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:42:10 crc kubenswrapper[4954]: E1206 08:42:10.402442 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.866979 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.868547 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.871344 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.874506 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 06 08:42:10 crc kubenswrapper[4954]: I1206 08:42:10.942952 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkmpn\" (UniqueName: \"kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn\") pod \"mariadb-client-1-default\" (UID: \"ff70bee5-eacd-44d0-9acf-738461a7e7eb\") " pod="openstack/mariadb-client-1-default" Dec 06 08:42:11 crc kubenswrapper[4954]: I1206 08:42:11.045034 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkmpn\" (UniqueName: \"kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn\") pod \"mariadb-client-1-default\" (UID: \"ff70bee5-eacd-44d0-9acf-738461a7e7eb\") " pod="openstack/mariadb-client-1-default" Dec 06 08:42:11 crc kubenswrapper[4954]: I1206 08:42:11.074179 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkmpn\" (UniqueName: \"kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn\") pod \"mariadb-client-1-default\" (UID: \"ff70bee5-eacd-44d0-9acf-738461a7e7eb\") " pod="openstack/mariadb-client-1-default" Dec 06 08:42:11 crc kubenswrapper[4954]: I1206 08:42:11.195130 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 06 08:42:11 crc kubenswrapper[4954]: I1206 08:42:11.706773 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 06 08:42:11 crc kubenswrapper[4954]: W1206 08:42:11.710921 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff70bee5_eacd_44d0_9acf_738461a7e7eb.slice/crio-8bd9bb1cc977d7f380f2a9e581d75bad47cf8fe225537bbd07c4614170289ddf WatchSource:0}: Error finding container 8bd9bb1cc977d7f380f2a9e581d75bad47cf8fe225537bbd07c4614170289ddf: Status 404 returned error can't find the container with id 8bd9bb1cc977d7f380f2a9e581d75bad47cf8fe225537bbd07c4614170289ddf Dec 06 08:42:11 crc kubenswrapper[4954]: I1206 08:42:11.715610 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 08:42:12 crc kubenswrapper[4954]: I1206 08:42:12.417750 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"ff70bee5-eacd-44d0-9acf-738461a7e7eb","Type":"ContainerStarted","Data":"8bd9bb1cc977d7f380f2a9e581d75bad47cf8fe225537bbd07c4614170289ddf"} Dec 06 08:42:15 crc kubenswrapper[4954]: I1206 08:42:15.439523 4954 generic.go:334] "Generic (PLEG): container finished" podID="ff70bee5-eacd-44d0-9acf-738461a7e7eb" containerID="293c63c3d5d38aeba4cfc103c69959f9368dbee132b38038e3ff08a5d59cdfd1" exitCode=0 Dec 06 08:42:15 crc kubenswrapper[4954]: I1206 08:42:15.439610 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"ff70bee5-eacd-44d0-9acf-738461a7e7eb","Type":"ContainerDied","Data":"293c63c3d5d38aeba4cfc103c69959f9368dbee132b38038e3ff08a5d59cdfd1"} Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.866731 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.909502 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_ff70bee5-eacd-44d0-9acf-738461a7e7eb/mariadb-client-1-default/0.log" Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.939059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkmpn\" (UniqueName: \"kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn\") pod \"ff70bee5-eacd-44d0-9acf-738461a7e7eb\" (UID: \"ff70bee5-eacd-44d0-9acf-738461a7e7eb\") " Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.939268 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.944585 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn" (OuterVolumeSpecName: "kube-api-access-tkmpn") pod "ff70bee5-eacd-44d0-9acf-738461a7e7eb" (UID: "ff70bee5-eacd-44d0-9acf-738461a7e7eb"). InnerVolumeSpecName "kube-api-access-tkmpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:16 crc kubenswrapper[4954]: I1206 08:42:16.948879 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.042046 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkmpn\" (UniqueName: \"kubernetes.io/projected/ff70bee5-eacd-44d0-9acf-738461a7e7eb-kube-api-access-tkmpn\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.367980 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 06 08:42:17 crc kubenswrapper[4954]: E1206 08:42:17.368418 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff70bee5-eacd-44d0-9acf-738461a7e7eb" containerName="mariadb-client-1-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.368437 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff70bee5-eacd-44d0-9acf-738461a7e7eb" containerName="mariadb-client-1-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.368642 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff70bee5-eacd-44d0-9acf-738461a7e7eb" containerName="mariadb-client-1-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.369245 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.374781 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.449135 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25rl\" (UniqueName: \"kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl\") pod \"mariadb-client-2-default\" (UID: \"873b18f9-23fd-4e2d-abd8-3127034e9e45\") " pod="openstack/mariadb-client-2-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.455543 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff70bee5-eacd-44d0-9acf-738461a7e7eb" path="/var/lib/kubelet/pods/ff70bee5-eacd-44d0-9acf-738461a7e7eb/volumes" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.459534 4954 scope.go:117] "RemoveContainer" containerID="293c63c3d5d38aeba4cfc103c69959f9368dbee132b38038e3ff08a5d59cdfd1" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.459577 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.551274 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25rl\" (UniqueName: \"kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl\") pod \"mariadb-client-2-default\" (UID: \"873b18f9-23fd-4e2d-abd8-3127034e9e45\") " pod="openstack/mariadb-client-2-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.568871 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25rl\" (UniqueName: \"kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl\") pod \"mariadb-client-2-default\" (UID: \"873b18f9-23fd-4e2d-abd8-3127034e9e45\") " pod="openstack/mariadb-client-2-default" Dec 06 08:42:17 crc kubenswrapper[4954]: I1206 08:42:17.690066 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 06 08:42:18 crc kubenswrapper[4954]: I1206 08:42:18.180465 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 06 08:42:18 crc kubenswrapper[4954]: W1206 08:42:18.184854 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod873b18f9_23fd_4e2d_abd8_3127034e9e45.slice/crio-f0db8496612c9dad6ef3e56115a8f254aae6bbc7e694f7edc000b5b86e273942 WatchSource:0}: Error finding container f0db8496612c9dad6ef3e56115a8f254aae6bbc7e694f7edc000b5b86e273942: Status 404 returned error can't find the container with id f0db8496612c9dad6ef3e56115a8f254aae6bbc7e694f7edc000b5b86e273942 Dec 06 08:42:18 crc kubenswrapper[4954]: I1206 08:42:18.467021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"873b18f9-23fd-4e2d-abd8-3127034e9e45","Type":"ContainerStarted","Data":"f0db8496612c9dad6ef3e56115a8f254aae6bbc7e694f7edc000b5b86e273942"} Dec 06 08:42:19 crc kubenswrapper[4954]: I1206 08:42:19.479235 4954 generic.go:334] "Generic (PLEG): container finished" podID="873b18f9-23fd-4e2d-abd8-3127034e9e45" containerID="ae8b3928a6b0f3cfc244772a984cdeb9b6c1fcdca34cb8a719e56be796b24521" exitCode=0 Dec 06 08:42:19 crc kubenswrapper[4954]: I1206 08:42:19.479333 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"873b18f9-23fd-4e2d-abd8-3127034e9e45","Type":"ContainerDied","Data":"ae8b3928a6b0f3cfc244772a984cdeb9b6c1fcdca34cb8a719e56be796b24521"} Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.820035 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.837769 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_873b18f9-23fd-4e2d-abd8-3127034e9e45/mariadb-client-2-default/0.log" Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.867759 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.874274 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.902375 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b25rl\" (UniqueName: \"kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl\") pod \"873b18f9-23fd-4e2d-abd8-3127034e9e45\" (UID: \"873b18f9-23fd-4e2d-abd8-3127034e9e45\") " Dec 06 08:42:20 crc kubenswrapper[4954]: I1206 08:42:20.908761 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl" (OuterVolumeSpecName: "kube-api-access-b25rl") pod "873b18f9-23fd-4e2d-abd8-3127034e9e45" (UID: "873b18f9-23fd-4e2d-abd8-3127034e9e45"). InnerVolumeSpecName "kube-api-access-b25rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.005014 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b25rl\" (UniqueName: \"kubernetes.io/projected/873b18f9-23fd-4e2d-abd8-3127034e9e45-kube-api-access-b25rl\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.005998 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-3-default"] Dec 06 08:42:21 crc kubenswrapper[4954]: E1206 08:42:21.006509 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873b18f9-23fd-4e2d-abd8-3127034e9e45" containerName="mariadb-client-2-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.006537 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="873b18f9-23fd-4e2d-abd8-3127034e9e45" containerName="mariadb-client-2-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.006751 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="873b18f9-23fd-4e2d-abd8-3127034e9e45" containerName="mariadb-client-2-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.007443 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.014812 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.106437 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflzd\" (UniqueName: \"kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd\") pod \"mariadb-client-3-default\" (UID: \"6e7d9d2f-0202-48ab-b397-00480fe845dd\") " pod="openstack/mariadb-client-3-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.207937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflzd\" (UniqueName: \"kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd\") pod \"mariadb-client-3-default\" (UID: \"6e7d9d2f-0202-48ab-b397-00480fe845dd\") " pod="openstack/mariadb-client-3-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.226924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflzd\" (UniqueName: \"kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd\") pod \"mariadb-client-3-default\" (UID: \"6e7d9d2f-0202-48ab-b397-00480fe845dd\") " pod="openstack/mariadb-client-3-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.328165 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.452409 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873b18f9-23fd-4e2d-abd8-3127034e9e45" path="/var/lib/kubelet/pods/873b18f9-23fd-4e2d-abd8-3127034e9e45/volumes" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.494017 4954 scope.go:117] "RemoveContainer" containerID="ae8b3928a6b0f3cfc244772a984cdeb9b6c1fcdca34cb8a719e56be796b24521" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.494200 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 06 08:42:21 crc kubenswrapper[4954]: I1206 08:42:21.806415 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 06 08:42:22 crc kubenswrapper[4954]: I1206 08:42:22.502515 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"6e7d9d2f-0202-48ab-b397-00480fe845dd","Type":"ContainerStarted","Data":"e1401541be69625927b51adeaf663a210e86da268dcf24622b539ad1b3a088a3"} Dec 06 08:42:22 crc kubenswrapper[4954]: I1206 08:42:22.502905 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"6e7d9d2f-0202-48ab-b397-00480fe845dd","Type":"ContainerStarted","Data":"853a3185a781c4353e23c5d956b1d10662d7136462f2bfb75a0025186f224a16"} Dec 06 08:42:22 crc kubenswrapper[4954]: I1206 08:42:22.519830 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-3-default" podStartSLOduration=2.519811894 podStartE2EDuration="2.519811894s" podCreationTimestamp="2025-12-06 08:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:42:22.517464082 +0000 UTC m=+6317.330823471" watchObservedRunningTime="2025-12-06 08:42:22.519811894 +0000 UTC m=+6317.333171283" Dec 06 08:42:25 crc kubenswrapper[4954]: I1206 08:42:25.450400 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:42:25 crc kubenswrapper[4954]: E1206 08:42:25.451096 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:42:25 crc kubenswrapper[4954]: I1206 08:42:25.525399 4954 generic.go:334] "Generic (PLEG): container finished" podID="6e7d9d2f-0202-48ab-b397-00480fe845dd" containerID="e1401541be69625927b51adeaf663a210e86da268dcf24622b539ad1b3a088a3" exitCode=0 Dec 06 08:42:25 crc kubenswrapper[4954]: I1206 08:42:25.525441 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"6e7d9d2f-0202-48ab-b397-00480fe845dd","Type":"ContainerDied","Data":"e1401541be69625927b51adeaf663a210e86da268dcf24622b539ad1b3a088a3"} Dec 06 08:42:26 crc kubenswrapper[4954]: I1206 08:42:26.902506 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 06 08:42:26 crc kubenswrapper[4954]: I1206 08:42:26.934234 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 06 08:42:26 crc kubenswrapper[4954]: I1206 08:42:26.942865 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.006107 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zflzd\" (UniqueName: \"kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd\") pod \"6e7d9d2f-0202-48ab-b397-00480fe845dd\" (UID: \"6e7d9d2f-0202-48ab-b397-00480fe845dd\") " Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.011616 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd" (OuterVolumeSpecName: "kube-api-access-zflzd") pod "6e7d9d2f-0202-48ab-b397-00480fe845dd" (UID: "6e7d9d2f-0202-48ab-b397-00480fe845dd"). InnerVolumeSpecName "kube-api-access-zflzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.107541 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zflzd\" (UniqueName: \"kubernetes.io/projected/6e7d9d2f-0202-48ab-b397-00480fe845dd-kube-api-access-zflzd\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.368156 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 06 08:42:27 crc kubenswrapper[4954]: E1206 08:42:27.368551 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7d9d2f-0202-48ab-b397-00480fe845dd" containerName="mariadb-client-3-default" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.368600 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7d9d2f-0202-48ab-b397-00480fe845dd" containerName="mariadb-client-3-default" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.368853 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7d9d2f-0202-48ab-b397-00480fe845dd" containerName="mariadb-client-3-default" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.369655 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.378401 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.460358 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7d9d2f-0202-48ab-b397-00480fe845dd" path="/var/lib/kubelet/pods/6e7d9d2f-0202-48ab-b397-00480fe845dd/volumes" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.513435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp6xm\" (UniqueName: \"kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm\") pod \"mariadb-client-1\" (UID: \"43da643b-7665-4c52-9a1b-5e74a1de75f9\") " pod="openstack/mariadb-client-1" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.543330 4954 scope.go:117] "RemoveContainer" containerID="e1401541be69625927b51adeaf663a210e86da268dcf24622b539ad1b3a088a3" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.543399 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.615673 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp6xm\" (UniqueName: \"kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm\") pod \"mariadb-client-1\" (UID: \"43da643b-7665-4c52-9a1b-5e74a1de75f9\") " pod="openstack/mariadb-client-1" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.636963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp6xm\" (UniqueName: \"kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm\") pod \"mariadb-client-1\" (UID: \"43da643b-7665-4c52-9a1b-5e74a1de75f9\") " pod="openstack/mariadb-client-1" Dec 06 08:42:27 crc kubenswrapper[4954]: I1206 08:42:27.694822 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 06 08:42:28 crc kubenswrapper[4954]: I1206 08:42:28.255330 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 06 08:42:28 crc kubenswrapper[4954]: I1206 08:42:28.553977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"43da643b-7665-4c52-9a1b-5e74a1de75f9","Type":"ContainerStarted","Data":"6c4d368541becfbe899a732bccdc4fef4cdc5f2cd6669c71dae2215fdb24623a"} Dec 06 08:42:28 crc kubenswrapper[4954]: I1206 08:42:28.554027 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"43da643b-7665-4c52-9a1b-5e74a1de75f9","Type":"ContainerStarted","Data":"87c7ed01d994c2dadb64ea07540e2b41d2a46b981afdf97593c930ca68a094de"} Dec 06 08:42:28 crc kubenswrapper[4954]: I1206 08:42:28.573187 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-1" podStartSLOduration=1.5731581079999999 podStartE2EDuration="1.573158108s" podCreationTimestamp="2025-12-06 08:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:42:28.568925475 +0000 UTC m=+6323.382284874" watchObservedRunningTime="2025-12-06 08:42:28.573158108 +0000 UTC m=+6323.386517507" Dec 06 08:42:29 crc kubenswrapper[4954]: I1206 08:42:29.138420 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_43da643b-7665-4c52-9a1b-5e74a1de75f9/mariadb-client-1/0.log" Dec 06 08:42:29 crc kubenswrapper[4954]: I1206 08:42:29.562123 4954 generic.go:334] "Generic (PLEG): container finished" podID="43da643b-7665-4c52-9a1b-5e74a1de75f9" containerID="6c4d368541becfbe899a732bccdc4fef4cdc5f2cd6669c71dae2215fdb24623a" exitCode=0 Dec 06 08:42:29 crc kubenswrapper[4954]: I1206 08:42:29.562762 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"43da643b-7665-4c52-9a1b-5e74a1de75f9","Type":"ContainerDied","Data":"6c4d368541becfbe899a732bccdc4fef4cdc5f2cd6669c71dae2215fdb24623a"} Dec 06 08:42:30 crc kubenswrapper[4954]: I1206 08:42:30.975914 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.014415 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.020286 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.071957 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp6xm\" (UniqueName: \"kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm\") pod \"43da643b-7665-4c52-9a1b-5e74a1de75f9\" (UID: \"43da643b-7665-4c52-9a1b-5e74a1de75f9\") " Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.077276 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm" (OuterVolumeSpecName: "kube-api-access-wp6xm") pod "43da643b-7665-4c52-9a1b-5e74a1de75f9" (UID: "43da643b-7665-4c52-9a1b-5e74a1de75f9"). InnerVolumeSpecName "kube-api-access-wp6xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.175534 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp6xm\" (UniqueName: \"kubernetes.io/projected/43da643b-7665-4c52-9a1b-5e74a1de75f9-kube-api-access-wp6xm\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.457431 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43da643b-7665-4c52-9a1b-5e74a1de75f9" path="/var/lib/kubelet/pods/43da643b-7665-4c52-9a1b-5e74a1de75f9/volumes" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.459067 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 06 08:42:31 crc kubenswrapper[4954]: E1206 08:42:31.459500 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43da643b-7665-4c52-9a1b-5e74a1de75f9" containerName="mariadb-client-1" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.459524 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="43da643b-7665-4c52-9a1b-5e74a1de75f9" containerName="mariadb-client-1" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.459828 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="43da643b-7665-4c52-9a1b-5e74a1de75f9" containerName="mariadb-client-1" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.460533 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.470791 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.581298 4954 scope.go:117] "RemoveContainer" containerID="6c4d368541becfbe899a732bccdc4fef4cdc5f2cd6669c71dae2215fdb24623a" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.581341 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.584530 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2rsb\" (UniqueName: \"kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb\") pod \"mariadb-client-4-default\" (UID: \"a2f722c6-8a4d-473a-86ca-47f460090fe1\") " pod="openstack/mariadb-client-4-default" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.686428 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2rsb\" (UniqueName: \"kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb\") pod \"mariadb-client-4-default\" (UID: \"a2f722c6-8a4d-473a-86ca-47f460090fe1\") " pod="openstack/mariadb-client-4-default" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.705208 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2rsb\" (UniqueName: \"kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb\") pod \"mariadb-client-4-default\" (UID: \"a2f722c6-8a4d-473a-86ca-47f460090fe1\") " pod="openstack/mariadb-client-4-default" Dec 06 08:42:31 crc kubenswrapper[4954]: I1206 08:42:31.782671 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 06 08:42:32 crc kubenswrapper[4954]: I1206 08:42:32.281018 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 06 08:42:32 crc kubenswrapper[4954]: I1206 08:42:32.589896 4954 generic.go:334] "Generic (PLEG): container finished" podID="a2f722c6-8a4d-473a-86ca-47f460090fe1" containerID="60fc5c446c5192654fe08533607b78dcfa3c982894b17283e72057e115e16858" exitCode=0 Dec 06 08:42:32 crc kubenswrapper[4954]: I1206 08:42:32.589946 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"a2f722c6-8a4d-473a-86ca-47f460090fe1","Type":"ContainerDied","Data":"60fc5c446c5192654fe08533607b78dcfa3c982894b17283e72057e115e16858"} Dec 06 08:42:32 crc kubenswrapper[4954]: I1206 08:42:32.589999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"a2f722c6-8a4d-473a-86ca-47f460090fe1","Type":"ContainerStarted","Data":"fda52d0e8f13e3895c5bad4722ec2ef0755a5a6b36b53bc4741889fc0f9ee644"} Dec 06 08:42:33 crc kubenswrapper[4954]: I1206 08:42:33.933531 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 06 08:42:33 crc kubenswrapper[4954]: I1206 08:42:33.951722 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_a2f722c6-8a4d-473a-86ca-47f460090fe1/mariadb-client-4-default/0.log" Dec 06 08:42:33 crc kubenswrapper[4954]: I1206 08:42:33.980513 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 06 08:42:33 crc kubenswrapper[4954]: I1206 08:42:33.991335 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 06 08:42:34 crc kubenswrapper[4954]: I1206 08:42:34.021313 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2rsb\" (UniqueName: \"kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb\") pod \"a2f722c6-8a4d-473a-86ca-47f460090fe1\" (UID: \"a2f722c6-8a4d-473a-86ca-47f460090fe1\") " Dec 06 08:42:34 crc kubenswrapper[4954]: I1206 08:42:34.026465 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb" (OuterVolumeSpecName: "kube-api-access-l2rsb") pod "a2f722c6-8a4d-473a-86ca-47f460090fe1" (UID: "a2f722c6-8a4d-473a-86ca-47f460090fe1"). InnerVolumeSpecName "kube-api-access-l2rsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:34 crc kubenswrapper[4954]: I1206 08:42:34.122908 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2rsb\" (UniqueName: \"kubernetes.io/projected/a2f722c6-8a4d-473a-86ca-47f460090fe1-kube-api-access-l2rsb\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:34 crc kubenswrapper[4954]: I1206 08:42:34.605127 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda52d0e8f13e3895c5bad4722ec2ef0755a5a6b36b53bc4741889fc0f9ee644" Dec 06 08:42:34 crc kubenswrapper[4954]: I1206 08:42:34.605192 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 06 08:42:35 crc kubenswrapper[4954]: I1206 08:42:35.452273 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f722c6-8a4d-473a-86ca-47f460090fe1" path="/var/lib/kubelet/pods/a2f722c6-8a4d-473a-86ca-47f460090fe1/volumes" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.442904 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:42:37 crc kubenswrapper[4954]: E1206 08:42:37.443390 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.609924 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 06 08:42:37 crc kubenswrapper[4954]: E1206 08:42:37.610305 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f722c6-8a4d-473a-86ca-47f460090fe1" containerName="mariadb-client-4-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.610325 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f722c6-8a4d-473a-86ca-47f460090fe1" containerName="mariadb-client-4-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.610583 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f722c6-8a4d-473a-86ca-47f460090fe1" containerName="mariadb-client-4-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.611163 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.615930 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.619757 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.677205 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjfs\" (UniqueName: \"kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs\") pod \"mariadb-client-5-default\" (UID: \"37954378-6cc4-4e0c-ab3d-eda123af0f40\") " pod="openstack/mariadb-client-5-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.778727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjfs\" (UniqueName: \"kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs\") pod \"mariadb-client-5-default\" (UID: \"37954378-6cc4-4e0c-ab3d-eda123af0f40\") " pod="openstack/mariadb-client-5-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.797862 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjfs\" (UniqueName: \"kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs\") pod \"mariadb-client-5-default\" (UID: \"37954378-6cc4-4e0c-ab3d-eda123af0f40\") " pod="openstack/mariadb-client-5-default" Dec 06 08:42:37 crc kubenswrapper[4954]: I1206 08:42:37.968907 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 06 08:42:38 crc kubenswrapper[4954]: I1206 08:42:38.507689 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 06 08:42:38 crc kubenswrapper[4954]: I1206 08:42:38.689070 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"37954378-6cc4-4e0c-ab3d-eda123af0f40","Type":"ContainerStarted","Data":"0296481bd97785e4d0dafa9df14ad084e7cb073f230ff92134fb85f0aee508c0"} Dec 06 08:42:39 crc kubenswrapper[4954]: I1206 08:42:39.697403 4954 generic.go:334] "Generic (PLEG): container finished" podID="37954378-6cc4-4e0c-ab3d-eda123af0f40" containerID="d6f3b31d45ece46a931cfb28d75cce17c7a0d760257bbb9a06cd7489ab8a602b" exitCode=0 Dec 06 08:42:39 crc kubenswrapper[4954]: I1206 08:42:39.697494 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"37954378-6cc4-4e0c-ab3d-eda123af0f40","Type":"ContainerDied","Data":"d6f3b31d45ece46a931cfb28d75cce17c7a0d760257bbb9a06cd7489ab8a602b"} Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.062716 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.088322 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_37954378-6cc4-4e0c-ab3d-eda123af0f40/mariadb-client-5-default/0.log" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.121056 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.126451 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.149578 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvjfs\" (UniqueName: \"kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs\") pod \"37954378-6cc4-4e0c-ab3d-eda123af0f40\" (UID: \"37954378-6cc4-4e0c-ab3d-eda123af0f40\") " Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.155355 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs" (OuterVolumeSpecName: "kube-api-access-pvjfs") pod "37954378-6cc4-4e0c-ab3d-eda123af0f40" (UID: "37954378-6cc4-4e0c-ab3d-eda123af0f40"). InnerVolumeSpecName "kube-api-access-pvjfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.256131 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvjfs\" (UniqueName: \"kubernetes.io/projected/37954378-6cc4-4e0c-ab3d-eda123af0f40-kube-api-access-pvjfs\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.273862 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 06 08:42:41 crc kubenswrapper[4954]: E1206 08:42:41.274310 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37954378-6cc4-4e0c-ab3d-eda123af0f40" containerName="mariadb-client-5-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.274326 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37954378-6cc4-4e0c-ab3d-eda123af0f40" containerName="mariadb-client-5-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.274513 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37954378-6cc4-4e0c-ab3d-eda123af0f40" containerName="mariadb-client-5-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.275082 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.282213 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.357958 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvk8x\" (UniqueName: \"kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x\") pod \"mariadb-client-6-default\" (UID: \"5146204f-bc03-4e54-a805-415a9ac2a8ee\") " pod="openstack/mariadb-client-6-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.453830 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37954378-6cc4-4e0c-ab3d-eda123af0f40" path="/var/lib/kubelet/pods/37954378-6cc4-4e0c-ab3d-eda123af0f40/volumes" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.459248 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvk8x\" (UniqueName: \"kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x\") pod \"mariadb-client-6-default\" (UID: \"5146204f-bc03-4e54-a805-415a9ac2a8ee\") " pod="openstack/mariadb-client-6-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.475456 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvk8x\" (UniqueName: \"kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x\") pod \"mariadb-client-6-default\" (UID: \"5146204f-bc03-4e54-a805-415a9ac2a8ee\") " pod="openstack/mariadb-client-6-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.594101 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.714700 4954 scope.go:117] "RemoveContainer" containerID="d6f3b31d45ece46a931cfb28d75cce17c7a0d760257bbb9a06cd7489ab8a602b" Dec 06 08:42:41 crc kubenswrapper[4954]: I1206 08:42:41.714863 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 06 08:42:42 crc kubenswrapper[4954]: I1206 08:42:42.087834 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 06 08:42:42 crc kubenswrapper[4954]: I1206 08:42:42.728535 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5146204f-bc03-4e54-a805-415a9ac2a8ee","Type":"ContainerStarted","Data":"8f8b98d2adf6137e79ebaa36c58e66dc1f2daca9a8dd747b7b7e3e14f8136f98"} Dec 06 08:42:44 crc kubenswrapper[4954]: I1206 08:42:44.748733 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5146204f-bc03-4e54-a805-415a9ac2a8ee","Type":"ContainerStarted","Data":"5be41206c112366394ecdb8cb70f1097195b2501503acf562c4c18067e6d8f77"} Dec 06 08:42:44 crc kubenswrapper[4954]: I1206 08:42:44.765785 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=3.765763312 podStartE2EDuration="3.765763312s" podCreationTimestamp="2025-12-06 08:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:42:44.760617334 +0000 UTC m=+6339.573976733" watchObservedRunningTime="2025-12-06 08:42:44.765763312 +0000 UTC m=+6339.579122701" Dec 06 08:42:45 crc kubenswrapper[4954]: I1206 08:42:45.756018 4954 generic.go:334] "Generic (PLEG): container finished" podID="5146204f-bc03-4e54-a805-415a9ac2a8ee" containerID="5be41206c112366394ecdb8cb70f1097195b2501503acf562c4c18067e6d8f77" exitCode=1 Dec 06 08:42:45 crc kubenswrapper[4954]: I1206 08:42:45.756079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5146204f-bc03-4e54-a805-415a9ac2a8ee","Type":"ContainerDied","Data":"5be41206c112366394ecdb8cb70f1097195b2501503acf562c4c18067e6d8f77"} Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.137155 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.174360 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.183250 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.262263 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvk8x\" (UniqueName: \"kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x\") pod \"5146204f-bc03-4e54-a805-415a9ac2a8ee\" (UID: \"5146204f-bc03-4e54-a805-415a9ac2a8ee\") " Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.268390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x" (OuterVolumeSpecName: "kube-api-access-kvk8x") pod "5146204f-bc03-4e54-a805-415a9ac2a8ee" (UID: "5146204f-bc03-4e54-a805-415a9ac2a8ee"). InnerVolumeSpecName "kube-api-access-kvk8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.325216 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 06 08:42:47 crc kubenswrapper[4954]: E1206 08:42:47.325557 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5146204f-bc03-4e54-a805-415a9ac2a8ee" containerName="mariadb-client-6-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.325599 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5146204f-bc03-4e54-a805-415a9ac2a8ee" containerName="mariadb-client-6-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.325839 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5146204f-bc03-4e54-a805-415a9ac2a8ee" containerName="mariadb-client-6-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.326408 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.335874 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.363840 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvk8x\" (UniqueName: \"kubernetes.io/projected/5146204f-bc03-4e54-a805-415a9ac2a8ee-kube-api-access-kvk8x\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.452589 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5146204f-bc03-4e54-a805-415a9ac2a8ee" path="/var/lib/kubelet/pods/5146204f-bc03-4e54-a805-415a9ac2a8ee/volumes" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.465185 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fsb\" (UniqueName: \"kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb\") pod \"mariadb-client-7-default\" (UID: \"f0400725-fdd0-4f67-ab84-ca735e7b99b8\") " pod="openstack/mariadb-client-7-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.566743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4fsb\" (UniqueName: \"kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb\") pod \"mariadb-client-7-default\" (UID: \"f0400725-fdd0-4f67-ab84-ca735e7b99b8\") " pod="openstack/mariadb-client-7-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.600478 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fsb\" (UniqueName: \"kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb\") pod \"mariadb-client-7-default\" (UID: \"f0400725-fdd0-4f67-ab84-ca735e7b99b8\") " pod="openstack/mariadb-client-7-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.649159 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.791312 4954 scope.go:117] "RemoveContainer" containerID="5be41206c112366394ecdb8cb70f1097195b2501503acf562c4c18067e6d8f77" Dec 06 08:42:47 crc kubenswrapper[4954]: I1206 08:42:47.791447 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 06 08:42:48 crc kubenswrapper[4954]: I1206 08:42:48.169351 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 06 08:42:48 crc kubenswrapper[4954]: W1206 08:42:48.174698 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0400725_fdd0_4f67_ab84_ca735e7b99b8.slice/crio-cf5f8ebf4a4e2b3929a2763aeee063f85c580a161b25cd5da44ef233801beb2c WatchSource:0}: Error finding container cf5f8ebf4a4e2b3929a2763aeee063f85c580a161b25cd5da44ef233801beb2c: Status 404 returned error can't find the container with id cf5f8ebf4a4e2b3929a2763aeee063f85c580a161b25cd5da44ef233801beb2c Dec 06 08:42:48 crc kubenswrapper[4954]: I1206 08:42:48.800293 4954 generic.go:334] "Generic (PLEG): container finished" podID="f0400725-fdd0-4f67-ab84-ca735e7b99b8" containerID="ff0a76f79636a16af4ee3fd5ce79baf8fd95782cbbe4478372284f9a1446200e" exitCode=0 Dec 06 08:42:48 crc kubenswrapper[4954]: I1206 08:42:48.800355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"f0400725-fdd0-4f67-ab84-ca735e7b99b8","Type":"ContainerDied","Data":"ff0a76f79636a16af4ee3fd5ce79baf8fd95782cbbe4478372284f9a1446200e"} Dec 06 08:42:48 crc kubenswrapper[4954]: I1206 08:42:48.800644 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"f0400725-fdd0-4f67-ab84-ca735e7b99b8","Type":"ContainerStarted","Data":"cf5f8ebf4a4e2b3929a2763aeee063f85c580a161b25cd5da44ef233801beb2c"} Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.667836 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.696895 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_f0400725-fdd0-4f67-ab84-ca735e7b99b8/mariadb-client-7-default/0.log" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.727431 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.737123 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.743307 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fsb\" (UniqueName: \"kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb\") pod \"f0400725-fdd0-4f67-ab84-ca735e7b99b8\" (UID: \"f0400725-fdd0-4f67-ab84-ca735e7b99b8\") " Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.752834 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb" (OuterVolumeSpecName: "kube-api-access-q4fsb") pod "f0400725-fdd0-4f67-ab84-ca735e7b99b8" (UID: "f0400725-fdd0-4f67-ab84-ca735e7b99b8"). InnerVolumeSpecName "kube-api-access-q4fsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.827823 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5f8ebf4a4e2b3929a2763aeee063f85c580a161b25cd5da44ef233801beb2c" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.827909 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.845705 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4fsb\" (UniqueName: \"kubernetes.io/projected/f0400725-fdd0-4f67-ab84-ca735e7b99b8-kube-api-access-q4fsb\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.872487 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 06 08:42:50 crc kubenswrapper[4954]: E1206 08:42:50.872956 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0400725-fdd0-4f67-ab84-ca735e7b99b8" containerName="mariadb-client-7-default" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.872986 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0400725-fdd0-4f67-ab84-ca735e7b99b8" containerName="mariadb-client-7-default" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.873210 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0400725-fdd0-4f67-ab84-ca735e7b99b8" containerName="mariadb-client-7-default" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.874072 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.877245 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.884472 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 06 08:42:50 crc kubenswrapper[4954]: I1206 08:42:50.947117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhmg\" (UniqueName: \"kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg\") pod \"mariadb-client-2\" (UID: \"8141d01f-64c8-46fc-84bc-c2764ca75ed8\") " pod="openstack/mariadb-client-2" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.048592 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhmg\" (UniqueName: \"kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg\") pod \"mariadb-client-2\" (UID: \"8141d01f-64c8-46fc-84bc-c2764ca75ed8\") " pod="openstack/mariadb-client-2" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.071292 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhmg\" (UniqueName: \"kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg\") pod \"mariadb-client-2\" (UID: \"8141d01f-64c8-46fc-84bc-c2764ca75ed8\") " pod="openstack/mariadb-client-2" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.202142 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.443885 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:42:51 crc kubenswrapper[4954]: E1206 08:42:51.444172 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.453736 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0400725-fdd0-4f67-ab84-ca735e7b99b8" path="/var/lib/kubelet/pods/f0400725-fdd0-4f67-ab84-ca735e7b99b8/volumes" Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.813641 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 06 08:42:51 crc kubenswrapper[4954]: W1206 08:42:51.819316 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8141d01f_64c8_46fc_84bc_c2764ca75ed8.slice/crio-86dfbf79bf3ae73340b2c973b02386f0d9756e053ea3c0ac492d145dd979381f WatchSource:0}: Error finding container 86dfbf79bf3ae73340b2c973b02386f0d9756e053ea3c0ac492d145dd979381f: Status 404 returned error can't find the container with id 86dfbf79bf3ae73340b2c973b02386f0d9756e053ea3c0ac492d145dd979381f Dec 06 08:42:51 crc kubenswrapper[4954]: I1206 08:42:51.842459 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8141d01f-64c8-46fc-84bc-c2764ca75ed8","Type":"ContainerStarted","Data":"86dfbf79bf3ae73340b2c973b02386f0d9756e053ea3c0ac492d145dd979381f"} Dec 06 08:42:52 crc kubenswrapper[4954]: I1206 08:42:52.854911 4954 generic.go:334] "Generic (PLEG): container finished" podID="8141d01f-64c8-46fc-84bc-c2764ca75ed8" containerID="f4e9e21265b502bf16a90ab46c92742a42444925df8ef72e04e5fdcb34df8218" exitCode=0 Dec 06 08:42:52 crc kubenswrapper[4954]: I1206 08:42:52.855043 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8141d01f-64c8-46fc-84bc-c2764ca75ed8","Type":"ContainerDied","Data":"f4e9e21265b502bf16a90ab46c92742a42444925df8ef72e04e5fdcb34df8218"} Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.277859 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.298768 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_8141d01f-64c8-46fc-84bc-c2764ca75ed8/mariadb-client-2/0.log" Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.323353 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.328446 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.459296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxhmg\" (UniqueName: \"kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg\") pod \"8141d01f-64c8-46fc-84bc-c2764ca75ed8\" (UID: \"8141d01f-64c8-46fc-84bc-c2764ca75ed8\") " Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.465253 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg" (OuterVolumeSpecName: "kube-api-access-nxhmg") pod "8141d01f-64c8-46fc-84bc-c2764ca75ed8" (UID: "8141d01f-64c8-46fc-84bc-c2764ca75ed8"). InnerVolumeSpecName "kube-api-access-nxhmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.562050 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxhmg\" (UniqueName: \"kubernetes.io/projected/8141d01f-64c8-46fc-84bc-c2764ca75ed8-kube-api-access-nxhmg\") on node \"crc\" DevicePath \"\"" Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.877065 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86dfbf79bf3ae73340b2c973b02386f0d9756e053ea3c0ac492d145dd979381f" Dec 06 08:42:54 crc kubenswrapper[4954]: I1206 08:42:54.877133 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 06 08:42:55 crc kubenswrapper[4954]: I1206 08:42:55.453661 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8141d01f-64c8-46fc-84bc-c2764ca75ed8" path="/var/lib/kubelet/pods/8141d01f-64c8-46fc-84bc-c2764ca75ed8/volumes" Dec 06 08:43:04 crc kubenswrapper[4954]: I1206 08:43:04.443698 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:43:04 crc kubenswrapper[4954]: E1206 08:43:04.444431 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:43:17 crc kubenswrapper[4954]: I1206 08:43:17.455133 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:43:17 crc kubenswrapper[4954]: E1206 08:43:17.455906 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:43:22 crc kubenswrapper[4954]: I1206 08:43:22.607947 4954 scope.go:117] "RemoveContainer" containerID="9422f2af6b3998c4acdb66d786d40ae1611f3e78c0dab9847a12bcee46a87a60" Dec 06 08:43:32 crc kubenswrapper[4954]: I1206 08:43:32.444250 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:43:32 crc kubenswrapper[4954]: E1206 08:43:32.445154 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:43:45 crc kubenswrapper[4954]: I1206 08:43:45.448665 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:43:45 crc kubenswrapper[4954]: E1206 08:43:45.449852 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:43:57 crc kubenswrapper[4954]: I1206 08:43:57.445712 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:43:57 crc kubenswrapper[4954]: E1206 08:43:57.446412 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:44:10 crc kubenswrapper[4954]: I1206 08:44:10.443727 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:44:10 crc kubenswrapper[4954]: E1206 08:44:10.444583 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:44:25 crc kubenswrapper[4954]: I1206 08:44:25.451338 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:44:25 crc kubenswrapper[4954]: E1206 08:44:25.453652 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:44:40 crc kubenswrapper[4954]: I1206 08:44:40.443313 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:44:40 crc kubenswrapper[4954]: E1206 08:44:40.444066 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:44:54 crc kubenswrapper[4954]: I1206 08:44:54.443611 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:44:54 crc kubenswrapper[4954]: E1206 08:44:54.444467 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.151092 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56"] Dec 06 08:45:00 crc kubenswrapper[4954]: E1206 08:45:00.151981 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8141d01f-64c8-46fc-84bc-c2764ca75ed8" containerName="mariadb-client-2" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.152003 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8141d01f-64c8-46fc-84bc-c2764ca75ed8" 
containerName="mariadb-client-2" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.152260 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8141d01f-64c8-46fc-84bc-c2764ca75ed8" containerName="mariadb-client-2" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.152884 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.155668 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.156906 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.163950 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56"] Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.165282 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69zwx\" (UniqueName: \"kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.165398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.165458 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.267070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.267526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69zwx\" (UniqueName: \"kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.267635 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: 
\"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.268110 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.275291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.286207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69zwx\" (UniqueName: \"kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx\") pod \"collect-profiles-29416845-w6s56\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.477523 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:00 crc kubenswrapper[4954]: I1206 08:45:00.968903 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56"] Dec 06 08:45:01 crc kubenswrapper[4954]: I1206 08:45:01.794723 4954 generic.go:334] "Generic (PLEG): container finished" podID="ff6704a9-ac73-4944-b286-047db35d742f" containerID="072d6be198759a163bbf3f43e5b5bb10cc516b1aba69f4ef435e9d993b123004" exitCode=0 Dec 06 08:45:01 crc kubenswrapper[4954]: I1206 08:45:01.794812 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" event={"ID":"ff6704a9-ac73-4944-b286-047db35d742f","Type":"ContainerDied","Data":"072d6be198759a163bbf3f43e5b5bb10cc516b1aba69f4ef435e9d993b123004"} Dec 06 08:45:01 crc kubenswrapper[4954]: I1206 08:45:01.795097 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" event={"ID":"ff6704a9-ac73-4944-b286-047db35d742f","Type":"ContainerStarted","Data":"9db808c684d16c54e6371e2dc1b485ed40de3ff9bb9302659a70420184e2a13d"} Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.075039 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.283049 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume\") pod \"ff6704a9-ac73-4944-b286-047db35d742f\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.283182 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69zwx\" (UniqueName: \"kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx\") pod \"ff6704a9-ac73-4944-b286-047db35d742f\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.283233 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume\") pod \"ff6704a9-ac73-4944-b286-047db35d742f\" (UID: \"ff6704a9-ac73-4944-b286-047db35d742f\") " Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.284104 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume" (OuterVolumeSpecName: "config-volume") pod "ff6704a9-ac73-4944-b286-047db35d742f" (UID: "ff6704a9-ac73-4944-b286-047db35d742f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.289290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ff6704a9-ac73-4944-b286-047db35d742f" (UID: "ff6704a9-ac73-4944-b286-047db35d742f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.290706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx" (OuterVolumeSpecName: "kube-api-access-69zwx") pod "ff6704a9-ac73-4944-b286-047db35d742f" (UID: "ff6704a9-ac73-4944-b286-047db35d742f"). InnerVolumeSpecName "kube-api-access-69zwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.385997 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6704a9-ac73-4944-b286-047db35d742f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.386090 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69zwx\" (UniqueName: \"kubernetes.io/projected/ff6704a9-ac73-4944-b286-047db35d742f-kube-api-access-69zwx\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.386115 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff6704a9-ac73-4944-b286-047db35d742f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.813368 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" event={"ID":"ff6704a9-ac73-4944-b286-047db35d742f","Type":"ContainerDied","Data":"9db808c684d16c54e6371e2dc1b485ed40de3ff9bb9302659a70420184e2a13d"} Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.813431 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db808c684d16c54e6371e2dc1b485ed40de3ff9bb9302659a70420184e2a13d" Dec 06 08:45:03 crc kubenswrapper[4954]: I1206 08:45:03.813490 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56" Dec 06 08:45:04 crc kubenswrapper[4954]: I1206 08:45:04.154378 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m"] Dec 06 08:45:04 crc kubenswrapper[4954]: I1206 08:45:04.160283 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416800-hbn6m"] Dec 06 08:45:05 crc kubenswrapper[4954]: I1206 08:45:05.473147 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dded35e-7239-430c-94a8-52974cd4eb0c" path="/var/lib/kubelet/pods/9dded35e-7239-430c-94a8-52974cd4eb0c/volumes" Dec 06 08:45:09 crc kubenswrapper[4954]: I1206 08:45:09.444057 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:45:09 crc kubenswrapper[4954]: E1206 08:45:09.444693 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:45:22 crc kubenswrapper[4954]: I1206 08:45:22.721961 4954 scope.go:117] "RemoveContainer" containerID="5611d172e5cbcc3fc432e189f4c58475582725ed0b2a724d1201575875585059" Dec 06 08:45:23 crc kubenswrapper[4954]: I1206 08:45:23.443087 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:45:23 crc kubenswrapper[4954]: E1206 08:45:23.443442 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.778211 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"]
Dec 06 08:45:31 crc kubenswrapper[4954]: E1206 08:45:31.780198 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6704a9-ac73-4944-b286-047db35d742f" containerName="collect-profiles"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.780273 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6704a9-ac73-4944-b286-047db35d742f" containerName="collect-profiles"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.780502 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6704a9-ac73-4944-b286-047db35d742f" containerName="collect-profiles"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.781684 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.798724 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"]
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.930980 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.931043 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x27l\" (UniqueName: \"kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:31 crc kubenswrapper[4954]: I1206 08:45:31.931069 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.033699 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.033762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x27l\" (UniqueName: \"kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.033793 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.034311 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.034370 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.055819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x27l\" (UniqueName: \"kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l\") pod \"redhat-marketplace-vbgpl\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.111297 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:32 crc kubenswrapper[4954]: I1206 08:45:32.404967 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"]
Dec 06 08:45:33 crc kubenswrapper[4954]: I1206 08:45:33.034970 4954 generic.go:334] "Generic (PLEG): container finished" podID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerID="01e4bb86422e018cccbc46bede0a8a724dc85e80a8ddf47d1475922302b11b9f" exitCode=0
Dec 06 08:45:33 crc kubenswrapper[4954]: I1206 08:45:33.035085 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerDied","Data":"01e4bb86422e018cccbc46bede0a8a724dc85e80a8ddf47d1475922302b11b9f"}
Dec 06 08:45:33 crc kubenswrapper[4954]: I1206 08:45:33.035547 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerStarted","Data":"79cefe3b0bd3ccbaba1e574769fff75ef0860fd7b6ad530b83b26c5d1efe90d6"}
Dec 06 08:45:35 crc kubenswrapper[4954]: I1206 08:45:35.054255 4954 generic.go:334] "Generic (PLEG): container finished" podID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerID="f6c31bce6d2790f0ede30a3847e77f2139003d013fd628ca59637f653eef4ee1" exitCode=0
Dec 06 08:45:35 crc kubenswrapper[4954]: I1206 08:45:35.054367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerDied","Data":"f6c31bce6d2790f0ede30a3847e77f2139003d013fd628ca59637f653eef4ee1"}
Dec 06 08:45:35 crc kubenswrapper[4954]: I1206 08:45:35.448486 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448"
Dec 06 08:45:35 crc kubenswrapper[4954]: E1206 08:45:35.448793 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:45:36 crc kubenswrapper[4954]: I1206 08:45:36.062678 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerStarted","Data":"bafd71775a816964b153520670d593fa50654541cbfd9f1f5c8e3d07847a08e5"}
Dec 06 08:45:42 crc kubenswrapper[4954]: I1206 08:45:42.111888 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:42 crc kubenswrapper[4954]: I1206 08:45:42.113653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:42 crc kubenswrapper[4954]: I1206 08:45:42.153304 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:42 crc kubenswrapper[4954]: I1206 08:45:42.175771 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vbgpl" podStartSLOduration=8.733400079 podStartE2EDuration="11.175729732s" podCreationTimestamp="2025-12-06 08:45:31 +0000 UTC" firstStartedPulling="2025-12-06 08:45:33.036652004 +0000 UTC m=+6507.850011393" lastFinishedPulling="2025-12-06 08:45:35.478981657 +0000 UTC m=+6510.292341046" observedRunningTime="2025-12-06 08:45:36.084991066 +0000 UTC m=+6510.898350455" watchObservedRunningTime="2025-12-06 08:45:42.175729732 +0000 UTC m=+6516.989089121"
Dec 06 08:45:43 crc kubenswrapper[4954]: I1206 08:45:43.162793 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Dec 06 08:45:43 crc kubenswrapper[4954]: I1206 08:45:43.211611 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"]
Dec 06 08:45:45 crc kubenswrapper[4954]: I1206 08:45:45.128282 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vbgpl" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="registry-server" containerID="cri-o://bafd71775a816964b153520670d593fa50654541cbfd9f1f5c8e3d07847a08e5" gracePeriod=2
Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.138944 4954 generic.go:334] "Generic (PLEG): container finished" podID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerID="bafd71775a816964b153520670d593fa50654541cbfd9f1f5c8e3d07847a08e5" exitCode=0
Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.139023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerDied","Data":"bafd71775a816964b153520670d593fa50654541cbfd9f1f5c8e3d07847a08e5"}
Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.697086 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbgpl"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbgpl" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.870064 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content\") pod \"945fbaa9-6047-419e-8255-c5ec3b67132c\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.870500 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities\") pod \"945fbaa9-6047-419e-8255-c5ec3b67132c\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.870629 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x27l\" (UniqueName: \"kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l\") pod \"945fbaa9-6047-419e-8255-c5ec3b67132c\" (UID: \"945fbaa9-6047-419e-8255-c5ec3b67132c\") " Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.872413 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities" (OuterVolumeSpecName: "utilities") pod "945fbaa9-6047-419e-8255-c5ec3b67132c" (UID: "945fbaa9-6047-419e-8255-c5ec3b67132c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.877115 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l" (OuterVolumeSpecName: "kube-api-access-5x27l") pod "945fbaa9-6047-419e-8255-c5ec3b67132c" (UID: "945fbaa9-6047-419e-8255-c5ec3b67132c"). InnerVolumeSpecName "kube-api-access-5x27l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.907581 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "945fbaa9-6047-419e-8255-c5ec3b67132c" (UID: "945fbaa9-6047-419e-8255-c5ec3b67132c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.972109 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x27l\" (UniqueName: \"kubernetes.io/projected/945fbaa9-6047-419e-8255-c5ec3b67132c-kube-api-access-5x27l\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.972158 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:46 crc kubenswrapper[4954]: I1206 08:45:46.972167 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945fbaa9-6047-419e-8255-c5ec3b67132c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.147603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbgpl" event={"ID":"945fbaa9-6047-419e-8255-c5ec3b67132c","Type":"ContainerDied","Data":"79cefe3b0bd3ccbaba1e574769fff75ef0860fd7b6ad530b83b26c5d1efe90d6"} Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.147649 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbgpl" Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.147657 4954 scope.go:117] "RemoveContainer" containerID="bafd71775a816964b153520670d593fa50654541cbfd9f1f5c8e3d07847a08e5" Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.165582 4954 scope.go:117] "RemoveContainer" containerID="f6c31bce6d2790f0ede30a3847e77f2139003d013fd628ca59637f653eef4ee1" Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.177791 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"] Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.190922 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbgpl"] Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.196489 4954 scope.go:117] "RemoveContainer" containerID="01e4bb86422e018cccbc46bede0a8a724dc85e80a8ddf47d1475922302b11b9f" Dec 06 08:45:47 crc kubenswrapper[4954]: I1206 08:45:47.452813 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" path="/var/lib/kubelet/pods/945fbaa9-6047-419e-8255-c5ec3b67132c/volumes" Dec 06 08:45:48 crc kubenswrapper[4954]: I1206 08:45:48.442817 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:45:48 crc kubenswrapper[4954]: E1206 08:45:48.443294 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:45:59 crc kubenswrapper[4954]: I1206 08:45:59.443757 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:45:59 crc kubenswrapper[4954]: E1206 08:45:59.446989 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:46:11 crc kubenswrapper[4954]: I1206 08:46:11.443897 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:46:11 crc kubenswrapper[4954]: E1206 08:46:11.444798 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:46:23 crc kubenswrapper[4954]: I1206 08:46:23.443330 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:46:23 crc kubenswrapper[4954]: E1206 08:46:23.444047 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:46:36 crc kubenswrapper[4954]: I1206 08:46:36.443298 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:46:36 crc kubenswrapper[4954]: E1206 08:46:36.444213 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:46:48 crc kubenswrapper[4954]: I1206 08:46:48.443381 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:46:48 crc kubenswrapper[4954]: E1206 08:46:48.444124 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:47:02 crc kubenswrapper[4954]: I1206 08:47:02.444484 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:47:02 crc kubenswrapper[4954]: E1206 08:47:02.446749 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.791937 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"] Dec 06 08:47:11 crc kubenswrapper[4954]: E1206 08:47:11.796007 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="registry-server" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.796046 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="registry-server" Dec 06 08:47:11 crc kubenswrapper[4954]: E1206 08:47:11.796081 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="extract-content" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.796091 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="extract-content" Dec 06 08:47:11 crc kubenswrapper[4954]: E1206 08:47:11.796133 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="extract-utilities" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.796141 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="extract-utilities" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.796808 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="945fbaa9-6047-419e-8255-c5ec3b67132c" containerName="registry-server" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.803263 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.816983 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"] Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.908324 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.908673 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsq7\" (UniqueName: \"kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:11 crc kubenswrapper[4954]: I1206 08:47:11.908747 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.010454 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsq7\" (UniqueName: \"kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.010501 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.010577 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.011025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.011270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.030335 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tzsq7\" (UniqueName: \"kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7\") pod \"certified-operators-qdjh2\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") " pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.123329 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdjh2" Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.635677 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"] Dec 06 08:47:12 crc kubenswrapper[4954]: I1206 08:47:12.814741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerStarted","Data":"773c05d32a9fe0a3454289d099a6ebdc12c857e9926330f47d2b90c6362d5f2a"} Dec 06 08:47:13 crc kubenswrapper[4954]: I1206 08:47:13.823239 4954 generic.go:334] "Generic (PLEG): container finished" podID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerID="80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb" exitCode=0 Dec 06 08:47:13 crc kubenswrapper[4954]: I1206 08:47:13.823310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerDied","Data":"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"} Dec 06 08:47:13 crc kubenswrapper[4954]: I1206 08:47:13.826663 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 08:47:16 crc kubenswrapper[4954]: I1206 08:47:16.444185 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:47:16 crc kubenswrapper[4954]: I1206 08:47:16.852825 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4"} Dec 06 08:47:17 crc kubenswrapper[4954]: I1206 08:47:17.862144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerStarted","Data":"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"} Dec 06 08:47:18 crc kubenswrapper[4954]: I1206 08:47:18.870241 4954 generic.go:334] "Generic (PLEG): container finished" podID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerID="84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296" exitCode=0 Dec 06 08:47:18 crc kubenswrapper[4954]: I1206 08:47:18.870295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerDied","Data":"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"} Dec 06 08:47:19 crc kubenswrapper[4954]: I1206 08:47:19.880133 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerStarted","Data":"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"} Dec 06 08:47:19 crc kubenswrapper[4954]: I1206 08:47:19.898322 4954 
Dec 06 08:47:22 crc kubenswrapper[4954]: I1206 08:47:22.124547 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:22 crc kubenswrapper[4954]: I1206 08:47:22.124847 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:22 crc kubenswrapper[4954]: I1206 08:47:22.172420 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:32 crc kubenswrapper[4954]: I1206 08:47:32.168950 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:32 crc kubenswrapper[4954]: I1206 08:47:32.211999 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"]
Dec 06 08:47:32 crc kubenswrapper[4954]: I1206 08:47:32.979772 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qdjh2" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="registry-server" containerID="cri-o://00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45" gracePeriod=2
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.899849 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.946122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities\") pod \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") "
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.946208 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content\") pod \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") "
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.946304 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzsq7\" (UniqueName: \"kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7\") pod \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\" (UID: \"94f0165a-4b5f-4a8e-ba04-3dca396f5e30\") "
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.947296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities" (OuterVolumeSpecName: "utilities") pod "94f0165a-4b5f-4a8e-ba04-3dca396f5e30" (UID: "94f0165a-4b5f-4a8e-ba04-3dca396f5e30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.961887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7" (OuterVolumeSpecName: "kube-api-access-tzsq7") pod "94f0165a-4b5f-4a8e-ba04-3dca396f5e30" (UID: "94f0165a-4b5f-4a8e-ba04-3dca396f5e30"). InnerVolumeSpecName "kube-api-access-tzsq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.996256 4954 generic.go:334] "Generic (PLEG): container finished" podID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerID="00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45" exitCode=0
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.996318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerDied","Data":"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"}
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.996347 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdjh2" event={"ID":"94f0165a-4b5f-4a8e-ba04-3dca396f5e30","Type":"ContainerDied","Data":"773c05d32a9fe0a3454289d099a6ebdc12c857e9926330f47d2b90c6362d5f2a"}
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.996365 4954 scope.go:117] "RemoveContainer" containerID="00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"
Dec 06 08:47:33 crc kubenswrapper[4954]: I1206 08:47:33.996493 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdjh2"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.050546 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzsq7\" (UniqueName: \"kubernetes.io/projected/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-kube-api-access-tzsq7\") on node \"crc\" DevicePath \"\""
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.050646 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.070503 4954 scope.go:117] "RemoveContainer" containerID="84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.084881 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94f0165a-4b5f-4a8e-ba04-3dca396f5e30" (UID: "94f0165a-4b5f-4a8e-ba04-3dca396f5e30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.092765 4954 scope.go:117] "RemoveContainer" containerID="80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.127792 4954 scope.go:117] "RemoveContainer" containerID="00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"
Dec 06 08:47:34 crc kubenswrapper[4954]: E1206 08:47:34.128194 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45\": container with ID starting with 00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45 not found: ID does not exist" containerID="00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.128226 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45"} err="failed to get container status \"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45\": rpc error: code = NotFound desc = could not find container \"00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45\": container with ID starting with 00a17a6f3d6a3d3a10ed8bee266fa04ca52ca455bf47a553ffb0ce91235ced45 not found: ID does not exist"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.128247 4954 scope.go:117] "RemoveContainer" containerID="84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"
Dec 06 08:47:34 crc kubenswrapper[4954]: E1206 08:47:34.128489 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296\": container with ID starting with 84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296 not found: ID does not exist" containerID="84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.128521 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296"} err="failed to get container status \"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296\": rpc error: code = NotFound desc = could not find container \"84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296\": container with ID starting with 84366f17444f7b166452493cd1e4c63f9bbfdd93b8b0666f1ec4eb606ec84296 not found: ID does not exist"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.128539 4954 scope.go:117] "RemoveContainer" containerID="80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"
Dec 06 08:47:34 crc kubenswrapper[4954]: E1206 08:47:34.128752 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb\": container with ID starting with 80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb not found: ID does not exist" containerID="80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"
Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.128773 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"} err="failed to get container status \"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb\": rpc error: code = NotFound desc = could not find container \"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb\": container with ID starting with 80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb not found: ID does not exist"
containerID={"Type":"cri-o","ID":"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb"} err="failed to get container status \"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb\": rpc error: code = NotFound desc = could not find container \"80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb\": container with ID starting with 80204cd367006a8a6677f5eececa04479fecf591503e7c1bc49212bd2dd058fb not found: ID does not exist" Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.152312 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f0165a-4b5f-4a8e-ba04-3dca396f5e30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.325613 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"] Dec 06 08:47:34 crc kubenswrapper[4954]: I1206 08:47:34.335422 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qdjh2"] Dec 06 08:47:35 crc kubenswrapper[4954]: I1206 08:47:35.451672 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" path="/var/lib/kubelet/pods/94f0165a-4b5f-4a8e-ba04-3dca396f5e30/volumes" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.759229 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:43 crc kubenswrapper[4954]: E1206 08:48:43.760302 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="extract-utilities" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.760321 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="extract-utilities" Dec 06 08:48:43 crc kubenswrapper[4954]: E1206 08:48:43.760353 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="registry-server" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.760359 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="registry-server" Dec 06 08:48:43 crc kubenswrapper[4954]: E1206 08:48:43.760372 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="extract-content" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.760378 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="extract-content" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.760534 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f0165a-4b5f-4a8e-ba04-3dca396f5e30" containerName="registry-server" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.761903 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.769873 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.874798 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f5dx\" (UniqueName: \"kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.875133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.875252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.977253 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.977323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.977419 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f5dx\" (UniqueName: \"kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.977955 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:43 crc kubenswrapper[4954]: I1206 08:48:43.977959 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:44 crc kubenswrapper[4954]: I1206 08:48:44.000549 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2f5dx\" (UniqueName: \"kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx\") pod \"redhat-operators-n8bsv\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:44 crc kubenswrapper[4954]: I1206 08:48:44.087748 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:44 crc kubenswrapper[4954]: I1206 08:48:44.546466 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:45 crc kubenswrapper[4954]: I1206 08:48:45.542271 4954 generic.go:334] "Generic (PLEG): container finished" podID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerID="b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c" exitCode=0 Dec 06 08:48:45 crc kubenswrapper[4954]: I1206 08:48:45.542379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerDied","Data":"b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c"} Dec 06 08:48:45 crc kubenswrapper[4954]: I1206 08:48:45.542647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerStarted","Data":"be6666d86615d987c526c92090bf686cd1387e37c1374529cccc24b3fa7be9de"} Dec 06 08:48:46 crc kubenswrapper[4954]: I1206 08:48:46.551773 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerStarted","Data":"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b"} Dec 06 08:48:47 crc kubenswrapper[4954]: I1206 08:48:47.561847 4954 generic.go:334] "Generic (PLEG): container finished" podID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerID="4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b" exitCode=0 Dec 06 08:48:47 crc kubenswrapper[4954]: I1206 08:48:47.561909 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerDied","Data":"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b"} Dec 06 08:48:48 crc kubenswrapper[4954]: I1206 08:48:48.570460 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerStarted","Data":"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f"} Dec 06 08:48:48 crc kubenswrapper[4954]: I1206 08:48:48.592990 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8bsv" podStartSLOduration=3.164050221 podStartE2EDuration="5.592951816s" podCreationTimestamp="2025-12-06 08:48:43 +0000 UTC" firstStartedPulling="2025-12-06 08:48:45.544760948 +0000 UTC m=+6700.358120337" lastFinishedPulling="2025-12-06 08:48:47.973662543 +0000 UTC m=+6702.787021932" observedRunningTime="2025-12-06 08:48:48.587167072 +0000 UTC m=+6703.400526461" watchObservedRunningTime="2025-12-06 08:48:48.592951816 +0000 UTC m=+6703.406311205" Dec 06 08:48:54 crc kubenswrapper[4954]: I1206 08:48:54.088742 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8bsv" 
Dec 06 08:48:54 crc kubenswrapper[4954]: I1206 08:48:54.089352 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:54 crc kubenswrapper[4954]: I1206 08:48:54.143511 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:54 crc kubenswrapper[4954]: I1206 08:48:54.660984 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:54 crc kubenswrapper[4954]: I1206 08:48:54.715301 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:56 crc kubenswrapper[4954]: I1206 08:48:56.625901 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8bsv" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="registry-server" containerID="cri-o://713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f" gracePeriod=2 Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.214162 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.310664 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content\") pod \"423deddf-7d05-4111-90ad-1ad45b585f9b\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.310760 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f5dx\" (UniqueName: \"kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx\") pod \"423deddf-7d05-4111-90ad-1ad45b585f9b\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.310851 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities\") pod \"423deddf-7d05-4111-90ad-1ad45b585f9b\" (UID: \"423deddf-7d05-4111-90ad-1ad45b585f9b\") " Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.311862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities" (OuterVolumeSpecName: "utilities") pod "423deddf-7d05-4111-90ad-1ad45b585f9b" (UID: "423deddf-7d05-4111-90ad-1ad45b585f9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.316636 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx" (OuterVolumeSpecName: "kube-api-access-2f5dx") pod "423deddf-7d05-4111-90ad-1ad45b585f9b" (UID: "423deddf-7d05-4111-90ad-1ad45b585f9b"). InnerVolumeSpecName "kube-api-access-2f5dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.412146 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.412180 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f5dx\" (UniqueName: \"kubernetes.io/projected/423deddf-7d05-4111-90ad-1ad45b585f9b-kube-api-access-2f5dx\") on node \"crc\" DevicePath \"\"" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.634518 4954 generic.go:334] "Generic (PLEG): container finished" podID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerID="713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f" exitCode=0 Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.634588 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bsv" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.634638 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerDied","Data":"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f"} Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.634750 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bsv" event={"ID":"423deddf-7d05-4111-90ad-1ad45b585f9b","Type":"ContainerDied","Data":"be6666d86615d987c526c92090bf686cd1387e37c1374529cccc24b3fa7be9de"} Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.634776 4954 scope.go:117] "RemoveContainer" containerID="713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.659074 4954 scope.go:117] "RemoveContainer" containerID="4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.686832 4954 scope.go:117] "RemoveContainer" containerID="b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.716157 4954 scope.go:117] "RemoveContainer" containerID="713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f" Dec 06 08:48:57 crc kubenswrapper[4954]: E1206 08:48:57.716862 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f\": container with ID starting with 713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f not found: ID does not exist" containerID="713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.716919 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f"} err="failed to get container status \"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f\": rpc error: code = NotFound desc = could not find container \"713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f\": container with ID starting with 713bea431480802520c1506b69760308d8811f1364c7e897f18ebe60cb310d0f not found: ID does not exist" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.716950 4954 scope.go:117] 
"RemoveContainer" containerID="4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b" Dec 06 08:48:57 crc kubenswrapper[4954]: E1206 08:48:57.717466 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b\": container with ID starting with 4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b not found: ID does not exist" containerID="4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.717494 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b"} err="failed to get container status \"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b\": rpc error: code = NotFound desc = could not find container \"4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b\": container with ID starting with 4295407045da2fb68bc0323aef971da57a94ffebac8892b97d2f1d9604fc414b not found: ID does not exist" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.717509 4954 scope.go:117] "RemoveContainer" containerID="b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c" Dec 06 08:48:57 crc kubenswrapper[4954]: E1206 08:48:57.717821 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c\": container with ID starting with b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c not found: ID does not exist" containerID="b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.717846 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c"} err="failed to get container status \"b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c\": rpc error: code = NotFound desc = could not find container \"b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c\": container with ID starting with b299fa78756fc09836f5b7aec3a254e7e57cccca655ea1d56ac0d21f9214e21c not found: ID does not exist" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.778198 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "423deddf-7d05-4111-90ad-1ad45b585f9b" (UID: "423deddf-7d05-4111-90ad-1ad45b585f9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.818148 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/423deddf-7d05-4111-90ad-1ad45b585f9b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.969878 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:57 crc kubenswrapper[4954]: I1206 08:48:57.976270 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8bsv"] Dec 06 08:48:59 crc kubenswrapper[4954]: I1206 08:48:59.460059 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" path="/var/lib/kubelet/pods/423deddf-7d05-4111-90ad-1ad45b585f9b/volumes" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.580421 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:00 crc kubenswrapper[4954]: E1206 08:49:00.580950 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="extract-content" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.580965 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="extract-content" Dec 06 08:49:00 crc kubenswrapper[4954]: E1206 08:49:00.580983 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="registry-server" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.581017 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="registry-server" Dec 06 08:49:00 crc kubenswrapper[4954]: E1206 08:49:00.581031 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="extract-utilities" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.581038 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="extract-utilities" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.581304 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="423deddf-7d05-4111-90ad-1ad45b585f9b" containerName="registry-server" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.584172 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.595421 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.767857 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.768385 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftbpf\" (UniqueName: \"kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.768413 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.869618 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.869682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftbpf\" (UniqueName: \"kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.869703 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.870282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.870417 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.893329 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ftbpf\" (UniqueName: \"kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf\") pod \"community-operators-q9fqz\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:00 crc kubenswrapper[4954]: I1206 08:49:00.943939 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:01 crc kubenswrapper[4954]: I1206 08:49:01.420656 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:01 crc kubenswrapper[4954]: I1206 08:49:01.673426 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerStarted","Data":"eefa2faa78c67e2b4ec8b3126dd8e810fdb5078fd6544a5591b61db15abaa537"} Dec 06 08:49:02 crc kubenswrapper[4954]: I1206 08:49:02.682165 4954 generic.go:334] "Generic (PLEG): container finished" podID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerID="8589da1a09d8da4acb2c1d121bc9c444d02a30d4a4de1c0db08eac1ffa55b73a" exitCode=0 Dec 06 08:49:02 crc kubenswrapper[4954]: I1206 08:49:02.682653 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerDied","Data":"8589da1a09d8da4acb2c1d121bc9c444d02a30d4a4de1c0db08eac1ffa55b73a"} Dec 06 08:49:03 crc kubenswrapper[4954]: I1206 08:49:03.692089 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerStarted","Data":"530a333483715e8109d11d9fc84a168cd5e60ad9446ff025f906f270d0b4ad06"} Dec 06 08:49:04 crc kubenswrapper[4954]: I1206 08:49:04.701124 4954 generic.go:334] "Generic (PLEG): container finished" podID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerID="530a333483715e8109d11d9fc84a168cd5e60ad9446ff025f906f270d0b4ad06" exitCode=0 Dec 06 08:49:04 crc kubenswrapper[4954]: I1206 08:49:04.701602 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerDied","Data":"530a333483715e8109d11d9fc84a168cd5e60ad9446ff025f906f270d0b4ad06"} Dec 06 08:49:05 crc kubenswrapper[4954]: I1206 08:49:05.713374 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerStarted","Data":"6f473dae377005ebd8652fef0e395c7eb31215c3d4cb5ec0a5e02ce65bb06324"} Dec 06 08:49:05 crc kubenswrapper[4954]: I1206 08:49:05.734437 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q9fqz" podStartSLOduration=3.133604409 podStartE2EDuration="5.73441949s" podCreationTimestamp="2025-12-06 08:49:00 +0000 UTC" firstStartedPulling="2025-12-06 08:49:02.68541586 +0000 UTC m=+6717.498775269" lastFinishedPulling="2025-12-06 08:49:05.286230961 +0000 UTC m=+6720.099590350" observedRunningTime="2025-12-06 08:49:05.732272493 +0000 UTC m=+6720.545631872" watchObservedRunningTime="2025-12-06 08:49:05.73441949 +0000 UTC m=+6720.547778879" Dec 06 08:49:10 crc kubenswrapper[4954]: I1206 08:49:10.944750 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:10 crc kubenswrapper[4954]: I1206 08:49:10.945488 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:11 crc kubenswrapper[4954]: I1206 08:49:11.019658 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:11 crc kubenswrapper[4954]: I1206 08:49:11.806934 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:11 crc kubenswrapper[4954]: I1206 08:49:11.859221 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:13 crc kubenswrapper[4954]: I1206 08:49:13.776162 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q9fqz" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="registry-server" containerID="cri-o://6f473dae377005ebd8652fef0e395c7eb31215c3d4cb5ec0a5e02ce65bb06324" gracePeriod=2 Dec 06 08:49:14 crc kubenswrapper[4954]: I1206 08:49:14.783583 4954 generic.go:334] "Generic (PLEG): container finished" podID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerID="6f473dae377005ebd8652fef0e395c7eb31215c3d4cb5ec0a5e02ce65bb06324" exitCode=0 Dec 06 08:49:14 crc kubenswrapper[4954]: I1206 08:49:14.783625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerDied","Data":"6f473dae377005ebd8652fef0e395c7eb31215c3d4cb5ec0a5e02ce65bb06324"} Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.290977 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.411122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content\") pod \"ce24b03f-640f-4c8f-a446-0b18acf260e9\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.411263 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftbpf\" (UniqueName: \"kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf\") pod \"ce24b03f-640f-4c8f-a446-0b18acf260e9\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.411337 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities\") pod \"ce24b03f-640f-4c8f-a446-0b18acf260e9\" (UID: \"ce24b03f-640f-4c8f-a446-0b18acf260e9\") " Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.412281 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities" (OuterVolumeSpecName: "utilities") pod "ce24b03f-640f-4c8f-a446-0b18acf260e9" (UID: "ce24b03f-640f-4c8f-a446-0b18acf260e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.417739 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf" (OuterVolumeSpecName: "kube-api-access-ftbpf") pod "ce24b03f-640f-4c8f-a446-0b18acf260e9" (UID: "ce24b03f-640f-4c8f-a446-0b18acf260e9"). InnerVolumeSpecName "kube-api-access-ftbpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.465973 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce24b03f-640f-4c8f-a446-0b18acf260e9" (UID: "ce24b03f-640f-4c8f-a446-0b18acf260e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.513284 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftbpf\" (UniqueName: \"kubernetes.io/projected/ce24b03f-640f-4c8f-a446-0b18acf260e9-kube-api-access-ftbpf\") on node \"crc\" DevicePath \"\"" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.513322 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.513337 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce24b03f-640f-4c8f-a446-0b18acf260e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.793373 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9fqz" event={"ID":"ce24b03f-640f-4c8f-a446-0b18acf260e9","Type":"ContainerDied","Data":"eefa2faa78c67e2b4ec8b3126dd8e810fdb5078fd6544a5591b61db15abaa537"} Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.793433 4954 scope.go:117] "RemoveContainer" containerID="6f473dae377005ebd8652fef0e395c7eb31215c3d4cb5ec0a5e02ce65bb06324" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.793445 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9fqz" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.822622 4954 scope.go:117] "RemoveContainer" containerID="530a333483715e8109d11d9fc84a168cd5e60ad9446ff025f906f270d0b4ad06" Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.828653 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.835375 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q9fqz"] Dec 06 08:49:15 crc kubenswrapper[4954]: I1206 08:49:15.858932 4954 scope.go:117] "RemoveContainer" containerID="8589da1a09d8da4acb2c1d121bc9c444d02a30d4a4de1c0db08eac1ffa55b73a" Dec 06 08:49:17 crc kubenswrapper[4954]: I1206 08:49:17.454447 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" path="/var/lib/kubelet/pods/ce24b03f-640f-4c8f-a446-0b18acf260e9/volumes" Dec 06 08:49:22 crc kubenswrapper[4954]: I1206 08:49:22.835495 4954 scope.go:117] "RemoveContainer" containerID="f4e9e21265b502bf16a90ab46c92742a42444925df8ef72e04e5fdcb34df8218" Dec 06 08:49:22 crc kubenswrapper[4954]: I1206 08:49:22.854458 4954 scope.go:117] "RemoveContainer" containerID="ff0a76f79636a16af4ee3fd5ce79baf8fd95782cbbe4478372284f9a1446200e" Dec 06 08:49:22 crc kubenswrapper[4954]: I1206 08:49:22.887639 4954 scope.go:117] "RemoveContainer" containerID="60fc5c446c5192654fe08533607b78dcfa3c982894b17283e72057e115e16858" Dec 06 08:49:40 crc kubenswrapper[4954]: I1206 08:49:40.101180 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:49:40 crc kubenswrapper[4954]: I1206 08:49:40.102043 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:50:10 crc kubenswrapper[4954]: I1206 08:50:10.100873 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:50:10 crc kubenswrapper[4954]: I1206 08:50:10.101454 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.100796 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.101355 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.101419 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.102203 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.102277 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4" gracePeriod=600 Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.427396 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4" exitCode=0 Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.427695 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4"} Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.428489 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"} Dec 06 08:50:40 crc kubenswrapper[4954]: I1206 08:50:40.428614 4954 scope.go:117] "RemoveContainer" containerID="4d431084dd818de1881127cb257aa34ed11759c80a476f5ef4ebd6f84fca5448" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.748132 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 06 08:51:09 crc kubenswrapper[4954]: E1206 08:51:09.749064 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="extract-utilities" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.749081 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="extract-utilities" Dec 06 08:51:09 crc kubenswrapper[4954]: E1206 08:51:09.749105 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="registry-server" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.749113 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="registry-server" Dec 06 08:51:09 crc kubenswrapper[4954]: E1206 08:51:09.749132 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" 
containerName="extract-content" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.749142 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="extract-content" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.749343 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce24b03f-640f-4c8f-a446-0b18acf260e9" containerName="registry-server" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.750041 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.755004 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.760645 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.844053 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs598\" (UniqueName: \"kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.844617 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.948356 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs598\" (UniqueName: \"kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.948490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.951779 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.951817 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c78dfd54010218d2dade7f900f48d1af4238490929498bf5c02db408c9b2e71b/globalmount\"" pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.969089 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs598\" (UniqueName: \"kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:09 crc kubenswrapper[4954]: I1206 08:51:09.977324 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") pod \"mariadb-copy-data\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") " pod="openstack/mariadb-copy-data" Dec 06 08:51:10 crc kubenswrapper[4954]: I1206 08:51:10.074508 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 06 08:51:10 crc kubenswrapper[4954]: I1206 08:51:10.390617 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 06 08:51:10 crc kubenswrapper[4954]: I1206 08:51:10.636748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c1d0492d-6e62-434f-9ede-02d942146a89","Type":"ContainerStarted","Data":"a6ed010538ba1029f5bb0790e878476d37f6e16617297d651553ed5342902e62"} Dec 06 08:51:10 crc kubenswrapper[4954]: I1206 08:51:10.637113 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c1d0492d-6e62-434f-9ede-02d942146a89","Type":"ContainerStarted","Data":"6dfb18bcf7e4abe9c17c5ce1e19844e0a6234d83fc350cee0979a833c876425d"} Dec 06 08:51:10 crc kubenswrapper[4954]: I1206 08:51:10.660528 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.660505671 podStartE2EDuration="2.660505671s" podCreationTimestamp="2025-12-06 08:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:51:10.654360937 +0000 UTC m=+6845.467720346" watchObservedRunningTime="2025-12-06 08:51:10.660505671 +0000 UTC m=+6845.473865060" Dec 06 08:51:13 crc kubenswrapper[4954]: I1206 08:51:13.974817 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:13 crc kubenswrapper[4954]: I1206 08:51:13.976945 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:13 crc kubenswrapper[4954]: I1206 08:51:13.985301 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:14 crc kubenswrapper[4954]: I1206 08:51:14.012012 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6gt4\" (UniqueName: \"kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4\") pod \"mariadb-client\" (UID: \"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b\") " pod="openstack/mariadb-client" Dec 06 08:51:14 crc kubenswrapper[4954]: I1206 08:51:14.113513 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6gt4\" (UniqueName: \"kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4\") pod \"mariadb-client\" (UID: \"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b\") " pod="openstack/mariadb-client" Dec 06 08:51:14 crc kubenswrapper[4954]: I1206 08:51:14.147381 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6gt4\" (UniqueName: \"kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4\") pod \"mariadb-client\" (UID: \"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b\") " pod="openstack/mariadb-client" Dec 06 08:51:14 crc kubenswrapper[4954]: I1206 08:51:14.301128 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:14 crc kubenswrapper[4954]: I1206 08:51:14.781518 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:15 crc kubenswrapper[4954]: I1206 08:51:15.674191 4954 generic.go:334] "Generic (PLEG): container finished" podID="4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" containerID="187e90abeeb32e29abf38fa176f6f86ebac8075279d9c2d5590cdab2773317f4" exitCode=0 Dec 06 08:51:15 crc kubenswrapper[4954]: I1206 08:51:15.674229 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b","Type":"ContainerDied","Data":"187e90abeeb32e29abf38fa176f6f86ebac8075279d9c2d5590cdab2773317f4"} Dec 06 08:51:15 crc kubenswrapper[4954]: I1206 08:51:15.674251 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b","Type":"ContainerStarted","Data":"e1e32233f7f467c64c69c62735304a88e160968cf4dcca24aa94fb5657b17151"} Dec 06 08:51:16 crc kubenswrapper[4954]: I1206 08:51:16.996059 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.026112 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b/mariadb-client/0.log" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.051059 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.057439 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.060727 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6gt4\" (UniqueName: \"kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4\") pod \"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b\" (UID: \"4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b\") " Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.068857 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4" (OuterVolumeSpecName: "kube-api-access-s6gt4") pod "4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" (UID: "4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b"). InnerVolumeSpecName "kube-api-access-s6gt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.162609 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6gt4\" (UniqueName: \"kubernetes.io/projected/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b-kube-api-access-s6gt4\") on node \"crc\" DevicePath \"\"" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.190618 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:17 crc kubenswrapper[4954]: E1206 08:51:17.191041 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" containerName="mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.191067 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" containerName="mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.191293 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" containerName="mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.191962 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.223900 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.264736 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp5g2\" (UniqueName: \"kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2\") pod \"mariadb-client\" (UID: \"003721ef-0867-46cc-a650-f79ffdfc95a9\") " pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.365939 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp5g2\" (UniqueName: \"kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2\") pod \"mariadb-client\" (UID: \"003721ef-0867-46cc-a650-f79ffdfc95a9\") " pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.385165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp5g2\" (UniqueName: \"kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2\") pod \"mariadb-client\" (UID: \"003721ef-0867-46cc-a650-f79ffdfc95a9\") " pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.452527 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b" path="/var/lib/kubelet/pods/4fc5cda1-bc6c-46ad-ba20-b7e58cf62f1b/volumes" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.527263 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.688796 4954 scope.go:117] "RemoveContainer" containerID="187e90abeeb32e29abf38fa176f6f86ebac8075279d9c2d5590cdab2773317f4" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.688949 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:17 crc kubenswrapper[4954]: I1206 08:51:17.932049 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:18 crc kubenswrapper[4954]: I1206 08:51:18.696462 4954 generic.go:334] "Generic (PLEG): container finished" podID="003721ef-0867-46cc-a650-f79ffdfc95a9" containerID="a36d8863c1f2fcdf0df44c66eaf9bfa80d788088a0035b5104e00363b99dfba4" exitCode=0 Dec 06 08:51:18 crc kubenswrapper[4954]: I1206 08:51:18.696536 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"003721ef-0867-46cc-a650-f79ffdfc95a9","Type":"ContainerDied","Data":"a36d8863c1f2fcdf0df44c66eaf9bfa80d788088a0035b5104e00363b99dfba4"} Dec 06 08:51:18 crc kubenswrapper[4954]: I1206 08:51:18.696829 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"003721ef-0867-46cc-a650-f79ffdfc95a9","Type":"ContainerStarted","Data":"fb74b61c145c90fceed4a9e80ec6a20e7acd8ff1f30b9cb071e84a3997cc59f2"} Dec 06 08:51:19 crc kubenswrapper[4954]: I1206 08:51:19.969098 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:19 crc kubenswrapper[4954]: I1206 08:51:19.986650 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_003721ef-0867-46cc-a650-f79ffdfc95a9/mariadb-client/0.log" Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.030059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp5g2\" (UniqueName: \"kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2\") pod \"003721ef-0867-46cc-a650-f79ffdfc95a9\" (UID: \"003721ef-0867-46cc-a650-f79ffdfc95a9\") " Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.035825 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2" (OuterVolumeSpecName: "kube-api-access-bp5g2") pod "003721ef-0867-46cc-a650-f79ffdfc95a9" (UID: "003721ef-0867-46cc-a650-f79ffdfc95a9"). InnerVolumeSpecName "kube-api-access-bp5g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.056039 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.063330 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.131768 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp5g2\" (UniqueName: \"kubernetes.io/projected/003721ef-0867-46cc-a650-f79ffdfc95a9-kube-api-access-bp5g2\") on node \"crc\" DevicePath \"\"" Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.711150 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb74b61c145c90fceed4a9e80ec6a20e7acd8ff1f30b9cb071e84a3997cc59f2" Dec 06 08:51:20 crc kubenswrapper[4954]: I1206 08:51:20.711401 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 06 08:51:21 crc kubenswrapper[4954]: I1206 08:51:21.451790 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003721ef-0867-46cc-a650-f79ffdfc95a9" path="/var/lib/kubelet/pods/003721ef-0867-46cc-a650-f79ffdfc95a9/volumes" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.161945 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 08:51:50 crc kubenswrapper[4954]: E1206 08:51:50.162985 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003721ef-0867-46cc-a650-f79ffdfc95a9" containerName="mariadb-client" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.163005 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="003721ef-0867-46cc-a650-f79ffdfc95a9" containerName="mariadb-client" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.163231 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="003721ef-0867-46cc-a650-f79ffdfc95a9" containerName="mariadb-client" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.164409 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.169165 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.169214 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.169499 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.169686 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8xm78" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.170065 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.181626 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.183834 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205189 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205251 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205281 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205313 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205345 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-config\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205407 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv9q9\" (UniqueName: \"kubernetes.io/projected/1dd08ec8-09d0-414b-a539-44085352f9d8-kube-api-access-fv9q9\") pod \"ovsdbserver-nb-0\" (UID: 
\"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205434 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.205474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.223577 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.228935 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.231409 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.245484 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.259103 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.306880 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.306928 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-config\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.306947 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.306968 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcd6r\" (UniqueName: \"kubernetes.io/projected/81b9c775-f786-4b39-9b42-9a3ad830eb17-kube-api-access-tcd6r\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307156 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\") pod \"ovsdbserver-nb-0\" (UID: 
\"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307215 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-config\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307396 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307473 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307572 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv9q9\" (UniqueName: \"kubernetes.io/projected/1dd08ec8-09d0-414b-a539-44085352f9d8-kube-api-access-fv9q9\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307618 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.307715 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.308810 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-config\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.309120 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.309437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dd08ec8-09d0-414b-a539-44085352f9d8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.313547 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.314142 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.316431 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd08ec8-09d0-414b-a539-44085352f9d8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.317887 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.317926 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3ff8a6fbb6c387073f2fe201f0fcd84d5eabcdd361f90de802498650d97dd12b/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.330587 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv9q9\" (UniqueName: \"kubernetes.io/projected/1dd08ec8-09d0-414b-a539-44085352f9d8-kube-api-access-fv9q9\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.354217 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-293d47a4-b4ef-4982-a888-d2d00ee6ca10\") pod \"ovsdbserver-nb-0\" (UID: \"1dd08ec8-09d0-414b-a539-44085352f9d8\") " pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410336 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410383 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410410 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410486 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-config\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410507 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-config\") pod 
\"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410529 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410552 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcd6r\" (UniqueName: \"kubernetes.io/projected/81b9c775-f786-4b39-9b42-9a3ad830eb17-kube-api-access-tcd6r\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5lv\" (UniqueName: \"kubernetes.io/projected/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-kube-api-access-hx5lv\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410696 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b5758859-1b78-4f74-9a26-342567ec3c35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5758859-1b78-4f74-9a26-342567ec3c35\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410821 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410878 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 
08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.410974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.412258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.412477 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.413618 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.413664 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/99ffe69e055cec5c50a51211a3af84544df8519a6d96243c0287688833fd488b/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.414316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b9c775-f786-4b39-9b42-9a3ad830eb17-config\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.416637 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.416839 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.417233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81b9c775-f786-4b39-9b42-9a3ad830eb17-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.433539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcd6r\" (UniqueName: \"kubernetes.io/projected/81b9c775-f786-4b39-9b42-9a3ad830eb17-kube-api-access-tcd6r\") 
pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.452799 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58c521e9-27c8-4fd3-8675-e02a89de308f\") pod \"ovsdbserver-nb-1\" (UID: \"81b9c775-f786-4b39-9b42-9a3ad830eb17\") " pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512063 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512145 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-config\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512186 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512274 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5lv\" (UniqueName: \"kubernetes.io/projected/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-kube-api-access-hx5lv\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512300 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b5758859-1b78-4f74-9a26-342567ec3c35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5758859-1b78-4f74-9a26-342567ec3c35\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.512398 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.513450 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-config\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.513924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.514325 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.514343 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.517410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.517431 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.517967 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.518138 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b5758859-1b78-4f74-9a26-342567ec3c35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5758859-1b78-4f74-9a26-342567ec3c35\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/110b1be97ade6ca1a216eecb9845fe254dafdceedb3aaf6834c6c6ab003735c4/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.518369 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.521497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.540825 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5lv\" (UniqueName: \"kubernetes.io/projected/da5862bf-e1b3-4b7c-8730-47c2de9e7f40-kube-api-access-hx5lv\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.555255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b5758859-1b78-4f74-9a26-342567ec3c35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5758859-1b78-4f74-9a26-342567ec3c35\") pod \"ovsdbserver-nb-2\" (UID: \"da5862bf-e1b3-4b7c-8730-47c2de9e7f40\") " pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:50 crc kubenswrapper[4954]: I1206 08:51:50.850556 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:51 crc kubenswrapper[4954]: I1206 08:51:51.188160 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 08:51:51 crc kubenswrapper[4954]: I1206 08:51:51.561812 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 06 08:51:51 crc kubenswrapper[4954]: I1206 08:51:51.914914 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 06 08:51:51 crc kubenswrapper[4954]: W1206 08:51:51.946262 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81b9c775_f786_4b39_9b42_9a3ad830eb17.slice/crio-7b043e641fe23e61dab51b4cc38dcaabd3dca1c1cbc16301c046ca25b33d13a1 WatchSource:0}: Error finding container 7b043e641fe23e61dab51b4cc38dcaabd3dca1c1cbc16301c046ca25b33d13a1: Status 404 returned error can't find the container with id 7b043e641fe23e61dab51b4cc38dcaabd3dca1c1cbc16301c046ca25b33d13a1 Dec 06 08:51:51 crc kubenswrapper[4954]: I1206 08:51:51.972230 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"da5862bf-e1b3-4b7c-8730-47c2de9e7f40","Type":"ContainerStarted","Data":"6f79463cdfeee3b5aabf48c68140ce00dfac38ac32745a1eb20983cb4b036683"} Dec 06 08:51:51 crc kubenswrapper[4954]: I1206 08:51:51.975015 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1dd08ec8-09d0-414b-a539-44085352f9d8","Type":"ContainerStarted","Data":"e0c856a24925aa8c6365db5be79f5e9f93eb3244c9d964e9df296f14df156d99"} Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.469487 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.476059 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.481977 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.489950 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tmdlw" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.490214 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.490998 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.493146 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.504200 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.505655 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.517273 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.518626 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.531766 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.539402 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.567970 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568077 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568108 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568217 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568276 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568312 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d819f14-112f-40a4-82df-4c2d74278715-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568332 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hccl\" (UniqueName: \"kubernetes.io/projected/0d819f14-112f-40a4-82df-4c2d74278715-kube-api-access-8hccl\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.568644 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670621 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670715 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670748 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-config\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670812 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-9686a104-42b8-429f-9725-b0a7309f38d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9686a104-42b8-429f-9725-b0a7309f38d5\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670836 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670875 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d819f14-112f-40a4-82df-4c2d74278715-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670896 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hccl\" (UniqueName: \"kubernetes.io/projected/0d819f14-112f-40a4-82df-4c2d74278715-kube-api-access-8hccl\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670962 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2hw\" (UniqueName: \"kubernetes.io/projected/78ff1ab7-855e-450b-86e4-12a4e394d429-kube-api-access-vk2hw\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.670983 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671024 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671045 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdrks\" (UniqueName: 
\"kubernetes.io/projected/56b50592-4180-4185-a6df-1ae1346e18b1-kube-api-access-fdrks\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671098 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671119 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671163 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-config\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671213 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671238 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e27b66da-138f-410a-b2c0-b863be074f76\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e27b66da-138f-410a-b2c0-b863be074f76\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.671256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdb-rundir\") pod 
\"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.672697 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-config\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.672753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d819f14-112f-40a4-82df-4c2d74278715-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.675792 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d819f14-112f-40a4-82df-4c2d74278715-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.677248 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.677355 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/292443b050b59e319c80f951b32d7d6cd5237d9c65b07966a3289e9660b14798/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.680725 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.690639 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.693583 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819f14-112f-40a4-82df-4c2d74278715-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.705500 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hccl\" (UniqueName: \"kubernetes.io/projected/0d819f14-112f-40a4-82df-4c2d74278715-kube-api-access-8hccl\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.718737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-497a69d7-79db-4ff9-a82e-6ef60d1dab2d\") pod \"ovsdbserver-sb-0\" (UID: \"0d819f14-112f-40a4-82df-4c2d74278715\") " pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772671 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-config\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772744 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9686a104-42b8-429f-9725-b0a7309f38d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9686a104-42b8-429f-9725-b0a7309f38d5\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772815 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2hw\" (UniqueName: \"kubernetes.io/projected/78ff1ab7-855e-450b-86e4-12a4e394d429-kube-api-access-vk2hw\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772893 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772916 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdrks\" (UniqueName: \"kubernetes.io/projected/56b50592-4180-4185-a6df-1ae1346e18b1-kube-api-access-fdrks\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.772985 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: 
\"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773025 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-config\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773049 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773133 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e27b66da-138f-410a-b2c0-b863be074f76\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e27b66da-138f-410a-b2c0-b863be074f76\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773160 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773643 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-config\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.773830 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.774180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.775171 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78ff1ab7-855e-450b-86e4-12a4e394d429-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.775212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-config\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.776934 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b50592-4180-4185-a6df-1ae1346e18b1-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777645 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777777 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e27b66da-138f-410a-b2c0-b863be074f76\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e27b66da-138f-410a-b2c0-b863be074f76\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d38b1a328bb2a4d4f76ed1a2b36b20f6bac3636d1836b5195856705126453c82/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777900 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777841 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777964 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.777985 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9686a104-42b8-429f-9725-b0a7309f38d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9686a104-42b8-429f-9725-b0a7309f38d5\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d4886d9332953a4d91df13cf214aa6a5e09bcfb4ee66be96b267731ef29711f6/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.779794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.780198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.784622 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b50592-4180-4185-a6df-1ae1346e18b1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.785311 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ff1ab7-855e-450b-86e4-12a4e394d429-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.792030 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdrks\" (UniqueName: \"kubernetes.io/projected/56b50592-4180-4185-a6df-1ae1346e18b1-kube-api-access-fdrks\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.795363 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2hw\" (UniqueName: \"kubernetes.io/projected/78ff1ab7-855e-450b-86e4-12a4e394d429-kube-api-access-vk2hw\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.806719 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.816835 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e27b66da-138f-410a-b2c0-b863be074f76\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e27b66da-138f-410a-b2c0-b863be074f76\") pod \"ovsdbserver-sb-2\" (UID: \"78ff1ab7-855e-450b-86e4-12a4e394d429\") " pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.823182 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9686a104-42b8-429f-9725-b0a7309f38d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9686a104-42b8-429f-9725-b0a7309f38d5\") pod \"ovsdbserver-sb-1\" (UID: \"56b50592-4180-4185-a6df-1ae1346e18b1\") " pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.835672 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.845919 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:52 crc kubenswrapper[4954]: I1206 08:51:52.985529 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81b9c775-f786-4b39-9b42-9a3ad830eb17","Type":"ContainerStarted","Data":"7b043e641fe23e61dab51b4cc38dcaabd3dca1c1cbc16301c046ca25b33d13a1"} Dec 06 08:51:53 crc kubenswrapper[4954]: I1206 08:51:53.424426 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 08:51:53 crc kubenswrapper[4954]: I1206 08:51:53.512435 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 06 08:51:53 crc kubenswrapper[4954]: W1206 08:51:53.516531 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78ff1ab7_855e_450b_86e4_12a4e394d429.slice/crio-73aa45cf006c03d4d9aaee77dd91fab8cd76895fe25997cb8d29f97ac70921b6 WatchSource:0}: Error finding container 73aa45cf006c03d4d9aaee77dd91fab8cd76895fe25997cb8d29f97ac70921b6: Status 404 returned error can't find the container with id 73aa45cf006c03d4d9aaee77dd91fab8cd76895fe25997cb8d29f97ac70921b6 Dec 06 08:51:53 crc kubenswrapper[4954]: I1206 08:51:53.999338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"78ff1ab7-855e-450b-86e4-12a4e394d429","Type":"ContainerStarted","Data":"73aa45cf006c03d4d9aaee77dd91fab8cd76895fe25997cb8d29f97ac70921b6"} Dec 06 08:51:54 crc kubenswrapper[4954]: I1206 08:51:54.001160 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d819f14-112f-40a4-82df-4c2d74278715","Type":"ContainerStarted","Data":"70a2fe79e19f12095189777c4d72aa1784e01d1278c2aa18ff6e91dd6e9b6009"} Dec 06 08:51:54 crc kubenswrapper[4954]: I1206 08:51:54.448349 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 06 08:51:56 crc kubenswrapper[4954]: I1206 08:51:56.028800 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d819f14-112f-40a4-82df-4c2d74278715","Type":"ContainerStarted","Data":"01bc29b3ace523a8d2d93e2a126ee38e3869c80ab0e3005dd3190d4c60ad1c3e"} Dec 06 08:51:56 crc kubenswrapper[4954]: I1206 08:51:56.031043 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-1" event={"ID":"56b50592-4180-4185-a6df-1ae1346e18b1","Type":"ContainerStarted","Data":"73efc56d9e433ec3916e183c84ad0efe6dbaf939ed23fa978543506997e97036"} Dec 06 08:51:56 crc kubenswrapper[4954]: I1206 08:51:56.032892 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"da5862bf-e1b3-4b7c-8730-47c2de9e7f40","Type":"ContainerStarted","Data":"9ea77958225759f1ac695e6187b9fbed7f00036bd44e68497c4dcfc8bf85a98b"} Dec 06 08:51:56 crc kubenswrapper[4954]: I1206 08:51:56.042018 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81b9c775-f786-4b39-9b42-9a3ad830eb17","Type":"ContainerStarted","Data":"92e4cff6d89063946a19d21d0add9942d5cb149e97db18b238dc3dcb27f11d69"} Dec 06 08:51:56 crc kubenswrapper[4954]: I1206 08:51:56.044583 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1dd08ec8-09d0-414b-a539-44085352f9d8","Type":"ContainerStarted","Data":"32554eb82d2d3cccd377f46eca11be3733394be09bc8f3683526bd94c925a9dc"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.053125 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0d819f14-112f-40a4-82df-4c2d74278715","Type":"ContainerStarted","Data":"e329d19135a8ec9cdb8a8bb3e2d8297bb361c1a90efb0d45f3450630441d5d65"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.055109 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"56b50592-4180-4185-a6df-1ae1346e18b1","Type":"ContainerStarted","Data":"8068728b5f3a228cbc22311e84edeefda2825d3db8900a2f577b4ae0db7804c8"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.055143 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"56b50592-4180-4185-a6df-1ae1346e18b1","Type":"ContainerStarted","Data":"c709b6225550ad6d42a8bfd7b6cda591839a22fcd0d988bba8b07b2ba86203a5"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.056823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"da5862bf-e1b3-4b7c-8730-47c2de9e7f40","Type":"ContainerStarted","Data":"b126792150047eceb37d7082b893cd77615b2049a04a40c6d6978d0137933382"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.058351 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81b9c775-f786-4b39-9b42-9a3ad830eb17","Type":"ContainerStarted","Data":"b3514c69c30582de821e5c22ab8ea2d38f4a4e9d338772216908d11c0d59f7b0"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.060225 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"78ff1ab7-855e-450b-86e4-12a4e394d429","Type":"ContainerStarted","Data":"6244aade761c9c5cecdaa87fd260e607cb081459204703ea632097a0eacd704e"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.060259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"78ff1ab7-855e-450b-86e4-12a4e394d429","Type":"ContainerStarted","Data":"1ee6ad2503326f37fd627bb9c97383830ba6ef6cbf929826d3610c583adc0a64"} Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.062442 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1dd08ec8-09d0-414b-a539-44085352f9d8","Type":"ContainerStarted","Data":"4980135dcc7e23010da0a3447a8331db9185afd0c79d5e3328c72d21e169a43c"} Dec 06 08:51:57 crc 
kubenswrapper[4954]: I1206 08:51:57.106967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.531285156 podStartE2EDuration="8.106927522s" podCreationTimestamp="2025-12-06 08:51:49 +0000 UTC" firstStartedPulling="2025-12-06 08:51:51.592700601 +0000 UTC m=+6886.406059990" lastFinishedPulling="2025-12-06 08:51:55.168342967 +0000 UTC m=+6889.981702356" observedRunningTime="2025-12-06 08:51:57.101869077 +0000 UTC m=+6891.915228466" watchObservedRunningTime="2025-12-06 08:51:57.106927522 +0000 UTC m=+6891.920286911" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.110223 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.958359667 podStartE2EDuration="6.110202739s" podCreationTimestamp="2025-12-06 08:51:51 +0000 UTC" firstStartedPulling="2025-12-06 08:51:53.418125914 +0000 UTC m=+6888.231485293" lastFinishedPulling="2025-12-06 08:51:55.569968966 +0000 UTC m=+6890.383328365" observedRunningTime="2025-12-06 08:51:57.085015489 +0000 UTC m=+6891.898374868" watchObservedRunningTime="2025-12-06 08:51:57.110202739 +0000 UTC m=+6891.923562128" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.128807 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=5.591112584 podStartE2EDuration="6.128786454s" podCreationTimestamp="2025-12-06 08:51:51 +0000 UTC" firstStartedPulling="2025-12-06 08:51:55.05649458 +0000 UTC m=+6889.869853969" lastFinishedPulling="2025-12-06 08:51:55.59416845 +0000 UTC m=+6890.407527839" observedRunningTime="2025-12-06 08:51:57.12187078 +0000 UTC m=+6891.935230199" watchObservedRunningTime="2025-12-06 08:51:57.128786454 +0000 UTC m=+6891.942145833" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.145631 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.069162566 podStartE2EDuration="6.145608942s" podCreationTimestamp="2025-12-06 08:51:51 +0000 UTC" firstStartedPulling="2025-12-06 08:51:53.51904974 +0000 UTC m=+6888.332409129" lastFinishedPulling="2025-12-06 08:51:55.595496116 +0000 UTC m=+6890.408855505" observedRunningTime="2025-12-06 08:51:57.140750682 +0000 UTC m=+6891.954110081" watchObservedRunningTime="2025-12-06 08:51:57.145608942 +0000 UTC m=+6891.958968341" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.174151 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.23313269 podStartE2EDuration="8.174131321s" podCreationTimestamp="2025-12-06 08:51:49 +0000 UTC" firstStartedPulling="2025-12-06 08:51:51.203325947 +0000 UTC m=+6886.016685336" lastFinishedPulling="2025-12-06 08:51:55.144324578 +0000 UTC m=+6889.957683967" observedRunningTime="2025-12-06 08:51:57.168509301 +0000 UTC m=+6891.981868690" watchObservedRunningTime="2025-12-06 08:51:57.174131321 +0000 UTC m=+6891.987490710" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.208010 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=5.020091776 podStartE2EDuration="8.207984682s" podCreationTimestamp="2025-12-06 08:51:49 +0000 UTC" firstStartedPulling="2025-12-06 08:51:51.953961106 +0000 UTC m=+6886.767320495" lastFinishedPulling="2025-12-06 08:51:55.141854012 +0000 UTC m=+6889.955213401" observedRunningTime="2025-12-06 08:51:57.183466789 +0000 UTC 
m=+6891.996826188" watchObservedRunningTime="2025-12-06 08:51:57.207984682 +0000 UTC m=+6892.021344081" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.807226 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.836802 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:57 crc kubenswrapper[4954]: I1206 08:51:57.846808 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:58 crc kubenswrapper[4954]: I1206 08:51:58.807656 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 08:51:58 crc kubenswrapper[4954]: I1206 08:51:58.836831 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 06 08:51:58 crc kubenswrapper[4954]: I1206 08:51:58.847110 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.514597 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.518686 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.558090 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.563165 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.850789 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 06 08:51:59 crc kubenswrapper[4954]: I1206 08:51:59.888060 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.084592 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.084625 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.084637 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.172158 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.190168 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.294269 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.485589 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.488754 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.492467 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.498940 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.654738 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.654782 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9p9k\" (UniqueName: \"kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.654822 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.655248 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.756697 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.756802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.756840 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9p9k\" (UniqueName: \"kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.756897 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc 
kubenswrapper[4954]: I1206 08:52:00.757860 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.758009 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.758330 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.776755 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9p9k\" (UniqueName: \"kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k\") pod \"dnsmasq-dns-6c9fbdcf-j4kfl\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:00 crc kubenswrapper[4954]: I1206 08:52:00.806773 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.257645 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.844113 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.882140 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.890147 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.897210 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.928788 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 06 08:52:01 crc kubenswrapper[4954]: I1206 08:52:01.945618 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.100916 4954 generic.go:334] "Generic (PLEG): container finished" podID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerID="90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b" exitCode=0 Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.101736 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" event={"ID":"417d7ac9-0d9c-4833-9731-72064ab8cb9e","Type":"ContainerDied","Data":"90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b"} Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.101803 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" event={"ID":"417d7ac9-0d9c-4833-9731-72064ab8cb9e","Type":"ContainerStarted","Data":"d428280cd929456f4e07dd1dbb53d3ba84717025e011e7c726a8b682547e5d6b"} Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.136847 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.200909 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.202583 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.204465 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.222528 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.383621 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.383681 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.383786 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.383874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.383918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8tn\" (UniqueName: \"kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.485288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.485351 4954 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.485377 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.485417 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.485450 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8tn\" (UniqueName: \"kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.486428 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.486432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.486547 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.486614 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.506286 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8tn\" (UniqueName: \"kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn\") pod \"dnsmasq-dns-869d89ccc5-67qzv\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.576324 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:02 crc kubenswrapper[4954]: I1206 08:52:02.987086 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.110026 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" event={"ID":"060e5362-4797-4fb4-8855-7dde90bb4bf8","Type":"ContainerStarted","Data":"a0c8540c81e18a6fa03c321332ff36b3bc9185120b7fcecd16c9e38bb99e3715"} Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.113197 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" event={"ID":"417d7ac9-0d9c-4833-9731-72064ab8cb9e","Type":"ContainerStarted","Data":"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8"} Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.113484 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="dnsmasq-dns" containerID="cri-o://b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8" gracePeriod=10 Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.113811 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.483351 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.613154 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc\") pod \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.613201 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config\") pod \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.613274 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9p9k\" (UniqueName: \"kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k\") pod \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.613318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb\") pod \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\" (UID: \"417d7ac9-0d9c-4833-9731-72064ab8cb9e\") " Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.635938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k" (OuterVolumeSpecName: "kube-api-access-f9p9k") pod "417d7ac9-0d9c-4833-9731-72064ab8cb9e" (UID: "417d7ac9-0d9c-4833-9731-72064ab8cb9e"). InnerVolumeSpecName "kube-api-access-f9p9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.671107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "417d7ac9-0d9c-4833-9731-72064ab8cb9e" (UID: "417d7ac9-0d9c-4833-9731-72064ab8cb9e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.690295 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config" (OuterVolumeSpecName: "config") pod "417d7ac9-0d9c-4833-9731-72064ab8cb9e" (UID: "417d7ac9-0d9c-4833-9731-72064ab8cb9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.696670 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "417d7ac9-0d9c-4833-9731-72064ab8cb9e" (UID: "417d7ac9-0d9c-4833-9731-72064ab8cb9e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.715344 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.715389 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.715405 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9p9k\" (UniqueName: \"kubernetes.io/projected/417d7ac9-0d9c-4833-9731-72064ab8cb9e-kube-api-access-f9p9k\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:03 crc kubenswrapper[4954]: I1206 08:52:03.715416 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417d7ac9-0d9c-4833-9731-72064ab8cb9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.122530 4954 generic.go:334] "Generic (PLEG): container finished" podID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerID="51a6d5b8a82fba03786299e422806156058bd23f79d612d7af562b8a848ce1e3" exitCode=0 Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.122589 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" event={"ID":"060e5362-4797-4fb4-8855-7dde90bb4bf8","Type":"ContainerDied","Data":"51a6d5b8a82fba03786299e422806156058bd23f79d612d7af562b8a848ce1e3"} Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.126825 4954 generic.go:334] "Generic (PLEG): container finished" podID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerID="b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8" exitCode=0 Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.126862 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" event={"ID":"417d7ac9-0d9c-4833-9731-72064ab8cb9e","Type":"ContainerDied","Data":"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8"} Dec 06 08:52:04 crc 
kubenswrapper[4954]: I1206 08:52:04.126885 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" event={"ID":"417d7ac9-0d9c-4833-9731-72064ab8cb9e","Type":"ContainerDied","Data":"d428280cd929456f4e07dd1dbb53d3ba84717025e011e7c726a8b682547e5d6b"} Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.126902 4954 scope.go:117] "RemoveContainer" containerID="b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.126905 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9fbdcf-j4kfl" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.219725 4954 scope.go:117] "RemoveContainer" containerID="90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.271990 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.276783 4954 scope.go:117] "RemoveContainer" containerID="b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.279487 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9fbdcf-j4kfl"] Dec 06 08:52:04 crc kubenswrapper[4954]: E1206 08:52:04.280033 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8\": container with ID starting with b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8 not found: ID does not exist" containerID="b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.280070 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8"} err="failed to get container status \"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8\": rpc error: code = NotFound desc = could not find container \"b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8\": container with ID starting with b75e8d469ebd7a9c38e6cccad0a651f548248bd988a07f425ab824a9f97798c8 not found: ID does not exist" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.280095 4954 scope.go:117] "RemoveContainer" containerID="90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b" Dec 06 08:52:04 crc kubenswrapper[4954]: E1206 08:52:04.280574 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b\": container with ID starting with 90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b not found: ID does not exist" containerID="90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.280603 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b"} err="failed to get container status \"90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b\": rpc error: code = NotFound desc = could not find container \"90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b\": container with ID starting with 
90c0a165e1941bfe1f9e4a398e8c529b8c8ac95dcf4fa274437c869ee4cb764b not found: ID does not exist" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.572266 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 06 08:52:04 crc kubenswrapper[4954]: E1206 08:52:04.572676 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="init" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.572692 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="init" Dec 06 08:52:04 crc kubenswrapper[4954]: E1206 08:52:04.572731 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="dnsmasq-dns" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.572738 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="dnsmasq-dns" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.572915 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" containerName="dnsmasq-dns" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.573561 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.582218 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.583327 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.732328 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.732851 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.732893 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7664z\" (UniqueName: \"kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.835279 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.835325 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7664z\" (UniqueName: \"kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z\") pod 
\"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.835426 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.838475 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.838525 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/43292426840b00e3771596305fa2c45411435d588ca3d340a68a41763c803b59/globalmount\"" pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.845959 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.858701 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7664z\" (UniqueName: \"kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.869291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") pod \"ovn-copy-data\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " pod="openstack/ovn-copy-data" Dec 06 08:52:04 crc kubenswrapper[4954]: I1206 08:52:04.902218 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 06 08:52:05 crc kubenswrapper[4954]: I1206 08:52:05.139934 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" event={"ID":"060e5362-4797-4fb4-8855-7dde90bb4bf8","Type":"ContainerStarted","Data":"a3abc95c23bdb4f2bcf32f418ea864da9eda7ba7590a3d4ab03cc259abf79669"} Dec 06 08:52:05 crc kubenswrapper[4954]: I1206 08:52:05.140355 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:05 crc kubenswrapper[4954]: I1206 08:52:05.162641 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" podStartSLOduration=3.162621325 podStartE2EDuration="3.162621325s" podCreationTimestamp="2025-12-06 08:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:52:05.157941861 +0000 UTC m=+6899.971301250" watchObservedRunningTime="2025-12-06 08:52:05.162621325 +0000 UTC m=+6899.975980724" Dec 06 08:52:05 crc kubenswrapper[4954]: W1206 08:52:05.417095 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8730273a_edb3_4f15_8e60_f359dfa1e91c.slice/crio-48a448171a772a7c8397b024516cfeb17c5df8bab2c7b59325e0c130492d6f36 WatchSource:0}: Error finding container 48a448171a772a7c8397b024516cfeb17c5df8bab2c7b59325e0c130492d6f36: Status 404 returned error can't find the container with id 48a448171a772a7c8397b024516cfeb17c5df8bab2c7b59325e0c130492d6f36 Dec 06 08:52:05 crc kubenswrapper[4954]: I1206 08:52:05.417447 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 06 08:52:05 crc kubenswrapper[4954]: I1206 08:52:05.455794 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417d7ac9-0d9c-4833-9731-72064ab8cb9e" path="/var/lib/kubelet/pods/417d7ac9-0d9c-4833-9731-72064ab8cb9e/volumes" Dec 06 08:52:06 crc kubenswrapper[4954]: I1206 08:52:06.151609 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"8730273a-edb3-4f15-8e60-f359dfa1e91c","Type":"ContainerStarted","Data":"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d"} Dec 06 08:52:06 crc kubenswrapper[4954]: I1206 08:52:06.152008 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"8730273a-edb3-4f15-8e60-f359dfa1e91c","Type":"ContainerStarted","Data":"48a448171a772a7c8397b024516cfeb17c5df8bab2c7b59325e0c130492d6f36"} Dec 06 08:52:06 crc kubenswrapper[4954]: I1206 08:52:06.172609 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.979660802 podStartE2EDuration="3.172565076s" podCreationTimestamp="2025-12-06 08:52:03 +0000 UTC" firstStartedPulling="2025-12-06 08:52:05.419549284 +0000 UTC m=+6900.232908683" lastFinishedPulling="2025-12-06 08:52:05.612453568 +0000 UTC m=+6900.425812957" observedRunningTime="2025-12-06 08:52:06.169102763 +0000 UTC m=+6900.982462162" watchObservedRunningTime="2025-12-06 08:52:06.172565076 +0000 UTC m=+6900.985924485" Dec 06 08:52:12 crc kubenswrapper[4954]: I1206 08:52:12.578841 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:12 crc kubenswrapper[4954]: I1206 08:52:12.680129 4954 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:52:12 crc kubenswrapper[4954]: I1206 08:52:12.680533 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" containerName="dnsmasq-dns" containerID="cri-o://c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb" gracePeriod=10 Dec 06 08:52:12 crc kubenswrapper[4954]: E1206 08:52:12.962149 4954 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.114:43260->38.129.56.114:35379: write tcp 38.129.56.114:43260->38.129.56.114:35379: write: broken pipe Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.156722 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.210249 4954 generic.go:334] "Generic (PLEG): container finished" podID="2e847fab-859e-418c-96d2-9780288fd5f7" containerID="c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb" exitCode=0 Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.210295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" event={"ID":"2e847fab-859e-418c-96d2-9780288fd5f7","Type":"ContainerDied","Data":"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb"} Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.210331 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" event={"ID":"2e847fab-859e-418c-96d2-9780288fd5f7","Type":"ContainerDied","Data":"8482361d5278f2d6989bc1b904462faa0e91ee52bb971921f06f842a6696bd41"} Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.210338 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7f6bbcbf-8lknl" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.210349 4954 scope.go:117] "RemoveContainer" containerID="c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.232783 4954 scope.go:117] "RemoveContainer" containerID="4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.254650 4954 scope.go:117] "RemoveContainer" containerID="c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb" Dec 06 08:52:13 crc kubenswrapper[4954]: E1206 08:52:13.255676 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb\": container with ID starting with c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb not found: ID does not exist" containerID="c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.255722 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb"} err="failed to get container status \"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb\": rpc error: code = NotFound desc = could not find container \"c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb\": container with ID starting with c29efca274771d1f3acaa9a5c9f8e49e159e07b57590bc770f4236bf95339cdb not found: ID does not exist" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.255758 4954 scope.go:117] "RemoveContainer" containerID="4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce" Dec 06 08:52:13 crc kubenswrapper[4954]: E1206 08:52:13.256303 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce\": container with ID starting with 4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce not found: ID does not exist" containerID="4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.256349 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce"} err="failed to get container status \"4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce\": rpc error: code = NotFound desc = could not find container \"4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce\": container with ID starting with 4ca5edda898d9d83a00886d13e2d117dab63652b8322ab9ef7e6029e19a6d7ce not found: ID does not exist" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.279966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config\") pod \"2e847fab-859e-418c-96d2-9780288fd5f7\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.280081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc\") pod \"2e847fab-859e-418c-96d2-9780288fd5f7\" (UID: 
\"2e847fab-859e-418c-96d2-9780288fd5f7\") " Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.280177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd5zb\" (UniqueName: \"kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb\") pod \"2e847fab-859e-418c-96d2-9780288fd5f7\" (UID: \"2e847fab-859e-418c-96d2-9780288fd5f7\") " Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.286261 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb" (OuterVolumeSpecName: "kube-api-access-qd5zb") pod "2e847fab-859e-418c-96d2-9780288fd5f7" (UID: "2e847fab-859e-418c-96d2-9780288fd5f7"). InnerVolumeSpecName "kube-api-access-qd5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.317869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e847fab-859e-418c-96d2-9780288fd5f7" (UID: "2e847fab-859e-418c-96d2-9780288fd5f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.319166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config" (OuterVolumeSpecName: "config") pod "2e847fab-859e-418c-96d2-9780288fd5f7" (UID: "2e847fab-859e-418c-96d2-9780288fd5f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.382224 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.382255 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e847fab-859e-418c-96d2-9780288fd5f7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.382267 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd5zb\" (UniqueName: \"kubernetes.io/projected/2e847fab-859e-418c-96d2-9780288fd5f7-kube-api-access-qd5zb\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.536449 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:52:13 crc kubenswrapper[4954]: I1206 08:52:13.551840 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7f6bbcbf-8lknl"] Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.224070 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 08:52:14 crc kubenswrapper[4954]: E1206 08:52:14.224440 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" containerName="dnsmasq-dns" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.224455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" containerName="dnsmasq-dns" Dec 06 08:52:14 crc kubenswrapper[4954]: E1206 08:52:14.224484 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" 
containerName="init" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.224492 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" containerName="init" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.224673 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" containerName="dnsmasq-dns" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.225528 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.238331 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.238430 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.239126 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kn58p" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.240126 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.261880 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.298876 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrl5\" (UniqueName: \"kubernetes.io/projected/56016706-35a9-41e9-afef-3555f10a4e94-kube-api-access-gsrl5\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.298949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.299121 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.299390 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-scripts\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.299456 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-config\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.299686 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.299866 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56016706-35a9-41e9-afef-3555f10a4e94-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-scripts\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401163 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-config\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401198 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401238 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56016706-35a9-41e9-afef-3555f10a4e94-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401292 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrl5\" (UniqueName: \"kubernetes.io/projected/56016706-35a9-41e9-afef-3555f10a4e94-kube-api-access-gsrl5\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401327 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401355 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.401998 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56016706-35a9-41e9-afef-3555f10a4e94-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.402698 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-config\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.403024 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56016706-35a9-41e9-afef-3555f10a4e94-scripts\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.405618 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.405722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.407317 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56016706-35a9-41e9-afef-3555f10a4e94-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.417961 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrl5\" (UniqueName: \"kubernetes.io/projected/56016706-35a9-41e9-afef-3555f10a4e94-kube-api-access-gsrl5\") pod \"ovn-northd-0\" (UID: \"56016706-35a9-41e9-afef-3555f10a4e94\") " pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.540687 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.821172 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 08:52:14 crc kubenswrapper[4954]: I1206 08:52:14.833878 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 08:52:15 crc kubenswrapper[4954]: I1206 08:52:15.236636 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56016706-35a9-41e9-afef-3555f10a4e94","Type":"ContainerStarted","Data":"67fccfb48d08c6bdf62e6d1181aca9e4789e2f4459c8a88c04371d4da8c5eedf"} Dec 06 08:52:15 crc kubenswrapper[4954]: I1206 08:52:15.456319 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e847fab-859e-418c-96d2-9780288fd5f7" path="/var/lib/kubelet/pods/2e847fab-859e-418c-96d2-9780288fd5f7/volumes" Dec 06 08:52:16 crc kubenswrapper[4954]: I1206 08:52:16.247117 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56016706-35a9-41e9-afef-3555f10a4e94","Type":"ContainerStarted","Data":"cb1c710fd36a5ae84fd549724a988af6d49302dd421547d630ba5f32bfc57a05"} Dec 06 08:52:16 crc kubenswrapper[4954]: I1206 08:52:16.247467 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56016706-35a9-41e9-afef-3555f10a4e94","Type":"ContainerStarted","Data":"f040544d223d42a0dbb39cf1c35cfec1d547a3d741e7617cc42f662ad296e713"} Dec 06 08:52:16 crc kubenswrapper[4954]: I1206 08:52:16.247482 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 08:52:16 crc kubenswrapper[4954]: I1206 08:52:16.271128 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.570381272 podStartE2EDuration="2.271100682s" podCreationTimestamp="2025-12-06 08:52:14 +0000 UTC" firstStartedPulling="2025-12-06 08:52:14.833507449 +0000 UTC m=+6909.646866838" lastFinishedPulling="2025-12-06 08:52:15.534226859 +0000 UTC m=+6910.347586248" observedRunningTime="2025-12-06 08:52:16.269297734 +0000 UTC m=+6911.082657123" watchObservedRunningTime="2025-12-06 08:52:16.271100682 +0000 UTC m=+6911.084460071" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.714158 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7a78-account-create-update-79rdj"] Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.715904 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.718638 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.722510 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lmzpf"] Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.723899 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.750579 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmzpf"] Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.778218 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7a78-account-create-update-79rdj"] Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.855542 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcth\" (UniqueName: \"kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.855836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.855948 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.856024 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4rl\" (UniqueName: \"kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.958292 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcth\" (UniqueName: \"kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.958399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.958447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.958483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4rl\" (UniqueName: 
\"kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.959387 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.959504 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.979323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4rl\" (UniqueName: \"kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl\") pod \"keystone-db-create-lmzpf\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:21 crc kubenswrapper[4954]: I1206 08:52:21.979323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcth\" (UniqueName: \"kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth\") pod \"keystone-7a78-account-create-update-79rdj\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:22 crc kubenswrapper[4954]: I1206 08:52:22.071443 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:22 crc kubenswrapper[4954]: I1206 08:52:22.075555 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:22 crc kubenswrapper[4954]: I1206 08:52:22.529697 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmzpf"] Dec 06 08:52:22 crc kubenswrapper[4954]: W1206 08:52:22.534374 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c30c8ba_bb27_4cbf_9586_8c4c915bd571.slice/crio-86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4 WatchSource:0}: Error finding container 86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4: Status 404 returned error can't find the container with id 86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4 Dec 06 08:52:22 crc kubenswrapper[4954]: W1206 08:52:22.611635 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9542d633_4858_414f_8132_e4f3e860d56b.slice/crio-8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba WatchSource:0}: Error finding container 8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba: Status 404 returned error can't find the container with id 8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba Dec 06 08:52:22 crc kubenswrapper[4954]: I1206 08:52:22.616754 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7a78-account-create-update-79rdj"] Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.304202 4954 generic.go:334] "Generic (PLEG): container finished" podID="3c30c8ba-bb27-4cbf-9586-8c4c915bd571" containerID="c0268e8e515dc62fa3730ba52504e551aa9923e39cabd7e224c50081a0f62485" exitCode=0 Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.304288 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmzpf" event={"ID":"3c30c8ba-bb27-4cbf-9586-8c4c915bd571","Type":"ContainerDied","Data":"c0268e8e515dc62fa3730ba52504e551aa9923e39cabd7e224c50081a0f62485"} Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.304320 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmzpf" event={"ID":"3c30c8ba-bb27-4cbf-9586-8c4c915bd571","Type":"ContainerStarted","Data":"86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4"} Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.306256 4954 generic.go:334] "Generic (PLEG): container finished" podID="9542d633-4858-414f-8132-e4f3e860d56b" containerID="dbc8f66159d7cae384b467443a21ce3e67df52b39bad636b0e7d7fb3391acfcf" exitCode=0 Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.306296 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a78-account-create-update-79rdj" event={"ID":"9542d633-4858-414f-8132-e4f3e860d56b","Type":"ContainerDied","Data":"dbc8f66159d7cae384b467443a21ce3e67df52b39bad636b0e7d7fb3391acfcf"} Dec 06 08:52:23 crc kubenswrapper[4954]: I1206 08:52:23.306318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a78-account-create-update-79rdj" event={"ID":"9542d633-4858-414f-8132-e4f3e860d56b","Type":"ContainerStarted","Data":"8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba"} Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.694820 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.703303 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.805455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts\") pod \"9542d633-4858-414f-8132-e4f3e860d56b\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.805499 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts\") pod \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.805539 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k4rl\" (UniqueName: \"kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl\") pod \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\" (UID: \"3c30c8ba-bb27-4cbf-9586-8c4c915bd571\") " Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.805743 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbcth\" (UniqueName: \"kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth\") pod \"9542d633-4858-414f-8132-e4f3e860d56b\" (UID: \"9542d633-4858-414f-8132-e4f3e860d56b\") " Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.806329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9542d633-4858-414f-8132-e4f3e860d56b" (UID: "9542d633-4858-414f-8132-e4f3e860d56b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.806352 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c30c8ba-bb27-4cbf-9586-8c4c915bd571" (UID: "3c30c8ba-bb27-4cbf-9586-8c4c915bd571"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.811310 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth" (OuterVolumeSpecName: "kube-api-access-bbcth") pod "9542d633-4858-414f-8132-e4f3e860d56b" (UID: "9542d633-4858-414f-8132-e4f3e860d56b"). InnerVolumeSpecName "kube-api-access-bbcth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.825125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl" (OuterVolumeSpecName: "kube-api-access-5k4rl") pod "3c30c8ba-bb27-4cbf-9586-8c4c915bd571" (UID: "3c30c8ba-bb27-4cbf-9586-8c4c915bd571"). InnerVolumeSpecName "kube-api-access-5k4rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.908008 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbcth\" (UniqueName: \"kubernetes.io/projected/9542d633-4858-414f-8132-e4f3e860d56b-kube-api-access-bbcth\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.908048 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9542d633-4858-414f-8132-e4f3e860d56b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.908058 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:24 crc kubenswrapper[4954]: I1206 08:52:24.908071 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k4rl\" (UniqueName: \"kubernetes.io/projected/3c30c8ba-bb27-4cbf-9586-8c4c915bd571-kube-api-access-5k4rl\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.322151 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmzpf" Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.322387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmzpf" event={"ID":"3c30c8ba-bb27-4cbf-9586-8c4c915bd571","Type":"ContainerDied","Data":"86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4"} Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.322428 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86685829ae4e761275cbb101c2fa0526d649ddaab48ee5e3ce380bdf5b4f30c4" Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.323489 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a78-account-create-update-79rdj" event={"ID":"9542d633-4858-414f-8132-e4f3e860d56b","Type":"ContainerDied","Data":"8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba"} Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.323518 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bd44a05bd709db7bcc3d212ee0502075885ecf61a89973e7ba923d0dcfa64ba" Dec 06 08:52:25 crc kubenswrapper[4954]: I1206 08:52:25.323581 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7a78-account-create-update-79rdj" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.245372 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-nv5pz"] Dec 06 08:52:27 crc kubenswrapper[4954]: E1206 08:52:27.246114 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c30c8ba-bb27-4cbf-9586-8c4c915bd571" containerName="mariadb-database-create" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.246132 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c30c8ba-bb27-4cbf-9586-8c4c915bd571" containerName="mariadb-database-create" Dec 06 08:52:27 crc kubenswrapper[4954]: E1206 08:52:27.246157 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9542d633-4858-414f-8132-e4f3e860d56b" containerName="mariadb-account-create-update" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.246165 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9542d633-4858-414f-8132-e4f3e860d56b" containerName="mariadb-account-create-update" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.246368 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9542d633-4858-414f-8132-e4f3e860d56b" containerName="mariadb-account-create-update" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.246386 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c30c8ba-bb27-4cbf-9586-8c4c915bd571" containerName="mariadb-database-create" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.247153 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.251824 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.252515 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pw4hk" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.252606 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nv5pz"] Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.253293 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.259051 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.349184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4mp\" (UniqueName: \"kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.349402 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.349655 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.450817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4mp\" (UniqueName: \"kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.450907 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.450980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.455942 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.465243 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.470088 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4mp\" (UniqueName: \"kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp\") pod \"keystone-db-sync-nv5pz\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.564006 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:27 crc kubenswrapper[4954]: I1206 08:52:27.999233 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nv5pz"] Dec 06 08:52:28 crc kubenswrapper[4954]: W1206 08:52:28.006813 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1477797_cbd1_46d5_a238_70b8a626f523.slice/crio-6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494 WatchSource:0}: Error finding container 6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494: Status 404 returned error can't find the container with id 6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494 Dec 06 08:52:28 crc kubenswrapper[4954]: I1206 08:52:28.346854 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nv5pz" event={"ID":"a1477797-cbd1-46d5-a238-70b8a626f523","Type":"ContainerStarted","Data":"6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494"} Dec 06 08:52:29 crc kubenswrapper[4954]: I1206 08:52:29.614933 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 08:52:33 crc kubenswrapper[4954]: I1206 08:52:33.388602 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nv5pz" event={"ID":"a1477797-cbd1-46d5-a238-70b8a626f523","Type":"ContainerStarted","Data":"ec5af4d7d75513ef4eeebe851183690db923c6fa8c063cb21797c6a784499a9f"} Dec 06 08:52:33 crc kubenswrapper[4954]: I1206 08:52:33.407959 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-nv5pz" podStartSLOduration=1.7708085740000001 podStartE2EDuration="6.407939482s" podCreationTimestamp="2025-12-06 08:52:27 +0000 UTC" firstStartedPulling="2025-12-06 08:52:28.008726351 +0000 UTC m=+6922.822085740" lastFinishedPulling="2025-12-06 08:52:32.645857259 +0000 UTC m=+6927.459216648" observedRunningTime="2025-12-06 08:52:33.400901394 +0000 UTC m=+6928.214260783" watchObservedRunningTime="2025-12-06 08:52:33.407939482 +0000 UTC m=+6928.221298871" Dec 06 08:52:35 crc kubenswrapper[4954]: I1206 08:52:35.405268 4954 generic.go:334] "Generic (PLEG): container finished" podID="a1477797-cbd1-46d5-a238-70b8a626f523" containerID="ec5af4d7d75513ef4eeebe851183690db923c6fa8c063cb21797c6a784499a9f" exitCode=0 Dec 06 08:52:35 crc kubenswrapper[4954]: I1206 08:52:35.405335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nv5pz" event={"ID":"a1477797-cbd1-46d5-a238-70b8a626f523","Type":"ContainerDied","Data":"ec5af4d7d75513ef4eeebe851183690db923c6fa8c063cb21797c6a784499a9f"} Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.724306 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.817350 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4mp\" (UniqueName: \"kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp\") pod \"a1477797-cbd1-46d5-a238-70b8a626f523\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.817458 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data\") pod \"a1477797-cbd1-46d5-a238-70b8a626f523\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.817492 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle\") pod \"a1477797-cbd1-46d5-a238-70b8a626f523\" (UID: \"a1477797-cbd1-46d5-a238-70b8a626f523\") " Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.827320 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp" (OuterVolumeSpecName: "kube-api-access-6z4mp") pod "a1477797-cbd1-46d5-a238-70b8a626f523" (UID: "a1477797-cbd1-46d5-a238-70b8a626f523"). InnerVolumeSpecName "kube-api-access-6z4mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.841838 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1477797-cbd1-46d5-a238-70b8a626f523" (UID: "a1477797-cbd1-46d5-a238-70b8a626f523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.860801 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data" (OuterVolumeSpecName: "config-data") pod "a1477797-cbd1-46d5-a238-70b8a626f523" (UID: "a1477797-cbd1-46d5-a238-70b8a626f523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.920055 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4mp\" (UniqueName: \"kubernetes.io/projected/a1477797-cbd1-46d5-a238-70b8a626f523-kube-api-access-6z4mp\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.920097 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:36 crc kubenswrapper[4954]: I1206 08:52:36.920109 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1477797-cbd1-46d5-a238-70b8a626f523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.420386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nv5pz" event={"ID":"a1477797-cbd1-46d5-a238-70b8a626f523","Type":"ContainerDied","Data":"6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494"} Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.420429 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c767d72097cfe159f796f32cd6bc998299e7d15041644c094ee2c35ed9a7494" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.420481 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nv5pz" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.668788 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:52:37 crc kubenswrapper[4954]: E1206 08:52:37.669403 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1477797-cbd1-46d5-a238-70b8a626f523" containerName="keystone-db-sync" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.669419 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1477797-cbd1-46d5-a238-70b8a626f523" containerName="keystone-db-sync" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.669619 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1477797-cbd1-46d5-a238-70b8a626f523" containerName="keystone-db-sync" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.670455 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.681372 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.716047 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zxfrk"] Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.717455 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.721133 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.721181 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.721354 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.721474 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pw4hk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.721768 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.732091 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zxfrk"] Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.736238 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.736314 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.736454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.736738 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.736774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gzrr\" (UniqueName: \"kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838572 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838625 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838658 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94n8s\" (UniqueName: \"kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838687 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838739 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838841 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838865 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gzrr\" (UniqueName: \"kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838892 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.838988 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.839639 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.839889 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.840294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.840514 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.865640 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gzrr\" (UniqueName: \"kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr\") pod \"dnsmasq-dns-79577dcf9-s97g2\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940282 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940343 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940482 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94n8s\" (UniqueName: 
\"kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940540 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.940583 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.944885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.945294 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.945737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.945878 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.946740 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.964235 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94n8s\" (UniqueName: \"kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s\") pod \"keystone-bootstrap-zxfrk\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:37 crc kubenswrapper[4954]: I1206 08:52:37.989883 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:38 crc kubenswrapper[4954]: I1206 08:52:38.035345 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:38 crc kubenswrapper[4954]: I1206 08:52:38.470955 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:52:38 crc kubenswrapper[4954]: I1206 08:52:38.561778 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zxfrk"] Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.441692 4954 generic.go:334] "Generic (PLEG): container finished" podID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerID="eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982" exitCode=0 Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.442605 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" event={"ID":"511a55c7-38eb-4bed-a1b2-fabf31088a7f","Type":"ContainerDied","Data":"eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982"} Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.466754 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" event={"ID":"511a55c7-38eb-4bed-a1b2-fabf31088a7f","Type":"ContainerStarted","Data":"41de9ec3760f5c7bc1c3f3a578be7562f4dcec364f5bdc793e7371632c919b6b"} Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.466807 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zxfrk" event={"ID":"d39ed8cc-d509-4f07-bde1-4ac69c810c60","Type":"ContainerStarted","Data":"b00dfc96ff5255ac93b2e6a1f9041cb8e56f587ef1bbc1dac79fb4e40710a890"} Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.466827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zxfrk" event={"ID":"d39ed8cc-d509-4f07-bde1-4ac69c810c60","Type":"ContainerStarted","Data":"3d2ab0c894d9b660f8f6bfe5681b75fadc4555267b27c45efbf59ef49bd22762"} Dec 06 08:52:39 crc kubenswrapper[4954]: I1206 08:52:39.504957 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zxfrk" podStartSLOduration=2.5049353439999997 podStartE2EDuration="2.504935344s" podCreationTimestamp="2025-12-06 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:52:39.494050624 +0000 UTC m=+6934.307410033" watchObservedRunningTime="2025-12-06 08:52:39.504935344 +0000 UTC m=+6934.318294733" Dec 06 08:52:40 crc kubenswrapper[4954]: I1206 08:52:40.101431 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:52:40 crc kubenswrapper[4954]: I1206 08:52:40.101734 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:52:40 crc kubenswrapper[4954]: I1206 08:52:40.464318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" event={"ID":"511a55c7-38eb-4bed-a1b2-fabf31088a7f","Type":"ContainerStarted","Data":"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08"} Dec 06 08:52:40 crc 
kubenswrapper[4954]: I1206 08:52:40.465602 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:40 crc kubenswrapper[4954]: I1206 08:52:40.489447 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" podStartSLOduration=3.489426027 podStartE2EDuration="3.489426027s" podCreationTimestamp="2025-12-06 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:52:40.482864982 +0000 UTC m=+6935.296224371" watchObservedRunningTime="2025-12-06 08:52:40.489426027 +0000 UTC m=+6935.302785416" Dec 06 08:52:42 crc kubenswrapper[4954]: I1206 08:52:42.495339 4954 generic.go:334] "Generic (PLEG): container finished" podID="d39ed8cc-d509-4f07-bde1-4ac69c810c60" containerID="b00dfc96ff5255ac93b2e6a1f9041cb8e56f587ef1bbc1dac79fb4e40710a890" exitCode=0 Dec 06 08:52:42 crc kubenswrapper[4954]: I1206 08:52:42.495948 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zxfrk" event={"ID":"d39ed8cc-d509-4f07-bde1-4ac69c810c60","Type":"ContainerDied","Data":"b00dfc96ff5255ac93b2e6a1f9041cb8e56f587ef1bbc1dac79fb4e40710a890"} Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.916522 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962641 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962686 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962762 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962815 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94n8s\" (UniqueName: \"kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962903 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.962944 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle\") pod \"d39ed8cc-d509-4f07-bde1-4ac69c810c60\" (UID: 
\"d39ed8cc-d509-4f07-bde1-4ac69c810c60\") " Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.969506 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.977987 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts" (OuterVolumeSpecName: "scripts") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.978778 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.993795 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data" (OuterVolumeSpecName: "config-data") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.993833 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s" (OuterVolumeSpecName: "kube-api-access-94n8s") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "kube-api-access-94n8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:43 crc kubenswrapper[4954]: I1206 08:52:43.999702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d39ed8cc-d509-4f07-bde1-4ac69c810c60" (UID: "d39ed8cc-d509-4f07-bde1-4ac69c810c60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.065989 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.066045 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94n8s\" (UniqueName: \"kubernetes.io/projected/d39ed8cc-d509-4f07-bde1-4ac69c810c60-kube-api-access-94n8s\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.066057 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.066068 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.066078 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.066086 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d39ed8cc-d509-4f07-bde1-4ac69c810c60-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.510748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zxfrk" event={"ID":"d39ed8cc-d509-4f07-bde1-4ac69c810c60","Type":"ContainerDied","Data":"3d2ab0c894d9b660f8f6bfe5681b75fadc4555267b27c45efbf59ef49bd22762"} Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.510796 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2ab0c894d9b660f8f6bfe5681b75fadc4555267b27c45efbf59ef49bd22762" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.510820 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zxfrk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.583517 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zxfrk"] Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.590203 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zxfrk"] Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.676732 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xjqjk"] Dec 06 08:52:44 crc kubenswrapper[4954]: E1206 08:52:44.677341 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39ed8cc-d509-4f07-bde1-4ac69c810c60" containerName="keystone-bootstrap" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.677440 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39ed8cc-d509-4f07-bde1-4ac69c810c60" containerName="keystone-bootstrap" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.677785 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39ed8cc-d509-4f07-bde1-4ac69c810c60" containerName="keystone-bootstrap" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.678684 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.682901 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.683295 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pw4hk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.683422 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.683815 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.683860 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.685128 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xjqjk"] Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777071 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777214 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777290 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777480 
4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777660 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrp5r\" (UniqueName: \"kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.777731 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.879816 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.879913 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrp5r\" (UniqueName: \"kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.879965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.879990 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.880047 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.880085 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.885069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.885207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.885395 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.886307 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.887695 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.896002 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrp5r\" (UniqueName: \"kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r\") pod \"keystone-bootstrap-xjqjk\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:44 crc kubenswrapper[4954]: I1206 08:52:44.995930 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:45 crc kubenswrapper[4954]: I1206 08:52:45.418018 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xjqjk"] Dec 06 08:52:45 crc kubenswrapper[4954]: W1206 08:52:45.431009 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod668295d0_3bb0_41cd_ae1d_6ab1558d79fc.slice/crio-e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9 WatchSource:0}: Error finding container e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9: Status 404 returned error can't find the container with id e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9 Dec 06 08:52:45 crc kubenswrapper[4954]: I1206 08:52:45.457882 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39ed8cc-d509-4f07-bde1-4ac69c810c60" path="/var/lib/kubelet/pods/d39ed8cc-d509-4f07-bde1-4ac69c810c60/volumes" Dec 06 08:52:45 crc kubenswrapper[4954]: I1206 08:52:45.520983 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xjqjk" event={"ID":"668295d0-3bb0-41cd-ae1d-6ab1558d79fc","Type":"ContainerStarted","Data":"e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9"} Dec 06 08:52:46 crc kubenswrapper[4954]: I1206 08:52:46.529381 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xjqjk" event={"ID":"668295d0-3bb0-41cd-ae1d-6ab1558d79fc","Type":"ContainerStarted","Data":"5720dd372db598a4d9ed79a7faf225ca141e8786642b06a3540a3def26930713"} Dec 06 08:52:46 crc kubenswrapper[4954]: I1206 08:52:46.549658 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xjqjk" podStartSLOduration=2.54963996 podStartE2EDuration="2.54963996s" podCreationTimestamp="2025-12-06 08:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:52:46.545589222 +0000 UTC m=+6941.358948671" watchObservedRunningTime="2025-12-06 08:52:46.54963996 +0000 UTC m=+6941.362999349" Dec 06 08:52:47 crc kubenswrapper[4954]: I1206 08:52:47.991731 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.040930 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.041641 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="dnsmasq-dns" containerID="cri-o://a3abc95c23bdb4f2bcf32f418ea864da9eda7ba7590a3d4ab03cc259abf79669" gracePeriod=10 Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.547023 4954 generic.go:334] "Generic (PLEG): container finished" podID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerID="a3abc95c23bdb4f2bcf32f418ea864da9eda7ba7590a3d4ab03cc259abf79669" exitCode=0 Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.547073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" event={"ID":"060e5362-4797-4fb4-8855-7dde90bb4bf8","Type":"ContainerDied","Data":"a3abc95c23bdb4f2bcf32f418ea864da9eda7ba7590a3d4ab03cc259abf79669"} Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.547101 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" event={"ID":"060e5362-4797-4fb4-8855-7dde90bb4bf8","Type":"ContainerDied","Data":"a0c8540c81e18a6fa03c321332ff36b3bc9185120b7fcecd16c9e38bb99e3715"} Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.547111 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c8540c81e18a6fa03c321332ff36b3bc9185120b7fcecd16c9e38bb99e3715" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.571527 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.643191 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8tn\" (UniqueName: \"kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn\") pod \"060e5362-4797-4fb4-8855-7dde90bb4bf8\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.643319 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config\") pod \"060e5362-4797-4fb4-8855-7dde90bb4bf8\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.643372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb\") pod \"060e5362-4797-4fb4-8855-7dde90bb4bf8\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.643429 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb\") pod \"060e5362-4797-4fb4-8855-7dde90bb4bf8\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.643475 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc\") pod \"060e5362-4797-4fb4-8855-7dde90bb4bf8\" (UID: \"060e5362-4797-4fb4-8855-7dde90bb4bf8\") " Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.663653 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn" (OuterVolumeSpecName: "kube-api-access-vx8tn") pod "060e5362-4797-4fb4-8855-7dde90bb4bf8" (UID: "060e5362-4797-4fb4-8855-7dde90bb4bf8"). InnerVolumeSpecName "kube-api-access-vx8tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.693722 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "060e5362-4797-4fb4-8855-7dde90bb4bf8" (UID: "060e5362-4797-4fb4-8855-7dde90bb4bf8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.695704 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config" (OuterVolumeSpecName: "config") pod "060e5362-4797-4fb4-8855-7dde90bb4bf8" (UID: "060e5362-4797-4fb4-8855-7dde90bb4bf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.697890 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "060e5362-4797-4fb4-8855-7dde90bb4bf8" (UID: "060e5362-4797-4fb4-8855-7dde90bb4bf8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.699023 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "060e5362-4797-4fb4-8855-7dde90bb4bf8" (UID: "060e5362-4797-4fb4-8855-7dde90bb4bf8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.745608 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.745637 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.745667 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8tn\" (UniqueName: \"kubernetes.io/projected/060e5362-4797-4fb4-8855-7dde90bb4bf8-kube-api-access-vx8tn\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.745678 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:48 crc kubenswrapper[4954]: I1206 08:52:48.745687 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/060e5362-4797-4fb4-8855-7dde90bb4bf8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:49 crc kubenswrapper[4954]: I1206 08:52:49.556887 4954 generic.go:334] "Generic (PLEG): container finished" podID="668295d0-3bb0-41cd-ae1d-6ab1558d79fc" containerID="5720dd372db598a4d9ed79a7faf225ca141e8786642b06a3540a3def26930713" exitCode=0 Dec 06 08:52:49 crc kubenswrapper[4954]: I1206 08:52:49.556945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xjqjk" event={"ID":"668295d0-3bb0-41cd-ae1d-6ab1558d79fc","Type":"ContainerDied","Data":"5720dd372db598a4d9ed79a7faf225ca141e8786642b06a3540a3def26930713"} Dec 06 08:52:49 crc kubenswrapper[4954]: I1206 08:52:49.557034 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869d89ccc5-67qzv" Dec 06 08:52:49 crc kubenswrapper[4954]: I1206 08:52:49.600957 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:49 crc kubenswrapper[4954]: I1206 08:52:49.614236 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869d89ccc5-67qzv"] Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.917674 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrp5r\" (UniqueName: \"kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996451 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996496 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996582 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996600 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:50 crc kubenswrapper[4954]: I1206 08:52:50.996631 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data\") pod \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\" (UID: \"668295d0-3bb0-41cd-ae1d-6ab1558d79fc\") " Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.002342 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r" (OuterVolumeSpecName: "kube-api-access-xrp5r") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "kube-api-access-xrp5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.003409 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.003828 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts" (OuterVolumeSpecName: "scripts") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.005720 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.023532 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.031026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data" (OuterVolumeSpecName: "config-data") pod "668295d0-3bb0-41cd-ae1d-6ab1558d79fc" (UID: "668295d0-3bb0-41cd-ae1d-6ab1558d79fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098733 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrp5r\" (UniqueName: \"kubernetes.io/projected/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-kube-api-access-xrp5r\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098769 4954 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098780 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098789 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098800 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.098808 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668295d0-3bb0-41cd-ae1d-6ab1558d79fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.453100 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" 
path="/var/lib/kubelet/pods/060e5362-4797-4fb4-8855-7dde90bb4bf8/volumes" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.578152 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xjqjk" event={"ID":"668295d0-3bb0-41cd-ae1d-6ab1558d79fc","Type":"ContainerDied","Data":"e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9"} Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.578207 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19267172cbca1ddc1db7a99b353b54d2351cf4706621602fec69fd4411870a9" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.578215 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xjqjk" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.660425 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6dc49868fc-6xrnw"] Dec 06 08:52:51 crc kubenswrapper[4954]: E1206 08:52:51.660784 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="init" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.660800 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="init" Dec 06 08:52:51 crc kubenswrapper[4954]: E1206 08:52:51.660818 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="dnsmasq-dns" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.660825 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="dnsmasq-dns" Dec 06 08:52:51 crc kubenswrapper[4954]: E1206 08:52:51.660852 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668295d0-3bb0-41cd-ae1d-6ab1558d79fc" containerName="keystone-bootstrap" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.660859 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="668295d0-3bb0-41cd-ae1d-6ab1558d79fc" containerName="keystone-bootstrap" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.661003 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="668295d0-3bb0-41cd-ae1d-6ab1558d79fc" containerName="keystone-bootstrap" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.661017 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="060e5362-4797-4fb4-8855-7dde90bb4bf8" containerName="dnsmasq-dns" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.661699 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.663239 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.670858 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.670941 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.671204 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pw4hk" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.672231 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.672280 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.682998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6dc49868fc-6xrnw"] Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708208 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-scripts\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708271 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-public-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708328 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-fernet-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-credential-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708465 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-config-data\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708863 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjbm\" (UniqueName: \"kubernetes.io/projected/9b2e2368-f7ef-44f1-a105-8c72469d28a1-kube-api-access-jfjbm\") pod \"keystone-6dc49868fc-6xrnw\" (UID: 
\"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708942 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-combined-ca-bundle\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.708976 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-internal-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.810891 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-fernet-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.810949 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-credential-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.810993 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-config-data\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.811056 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfjbm\" (UniqueName: \"kubernetes.io/projected/9b2e2368-f7ef-44f1-a105-8c72469d28a1-kube-api-access-jfjbm\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.811095 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-combined-ca-bundle\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.811123 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-internal-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.811207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-scripts\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " 
pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.811252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-public-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.815047 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-internal-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.815069 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-fernet-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.815172 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-credential-keys\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.815264 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-public-tls-certs\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.817883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-scripts\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.818819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-combined-ca-bundle\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.822204 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b2e2368-f7ef-44f1-a105-8c72469d28a1-config-data\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.836773 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjbm\" (UniqueName: \"kubernetes.io/projected/9b2e2368-f7ef-44f1-a105-8c72469d28a1-kube-api-access-jfjbm\") pod \"keystone-6dc49868fc-6xrnw\" (UID: \"9b2e2368-f7ef-44f1-a105-8c72469d28a1\") " pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:51 crc kubenswrapper[4954]: I1206 08:52:51.986737 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:52 crc kubenswrapper[4954]: I1206 08:52:52.469395 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6dc49868fc-6xrnw"] Dec 06 08:52:52 crc kubenswrapper[4954]: I1206 08:52:52.587087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6dc49868fc-6xrnw" event={"ID":"9b2e2368-f7ef-44f1-a105-8c72469d28a1","Type":"ContainerStarted","Data":"b14ee48320c6ff6c6b5c37c01d91db90070813fe7739f8c6b056a9c81855cb61"} Dec 06 08:52:53 crc kubenswrapper[4954]: I1206 08:52:53.597205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6dc49868fc-6xrnw" event={"ID":"9b2e2368-f7ef-44f1-a105-8c72469d28a1","Type":"ContainerStarted","Data":"8cb4e78447cef10287b052ba131f99d967ba953768793735782bbffa4d0cdc13"} Dec 06 08:52:53 crc kubenswrapper[4954]: I1206 08:52:53.597467 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:52:53 crc kubenswrapper[4954]: I1206 08:52:53.624131 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6dc49868fc-6xrnw" podStartSLOduration=2.624086011 podStartE2EDuration="2.624086011s" podCreationTimestamp="2025-12-06 08:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:52:53.615750629 +0000 UTC m=+6948.429110028" watchObservedRunningTime="2025-12-06 08:52:53.624086011 +0000 UTC m=+6948.437445390" Dec 06 08:53:10 crc kubenswrapper[4954]: I1206 08:53:10.101289 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:53:10 crc kubenswrapper[4954]: I1206 08:53:10.101814 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:53:23 crc kubenswrapper[4954]: I1206 08:53:23.605671 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6dc49868fc-6xrnw" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.714900 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.716506 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.718993 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.719428 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.719512 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-f4jqh" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.725424 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.825847 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.825895 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jsh\" (UniqueName: \"kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.825922 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.825944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.928125 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.928177 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jsh\" (UniqueName: \"kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.928215 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.928249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.929838 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.935440 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.938211 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:26 crc kubenswrapper[4954]: I1206 08:53:26.954258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jsh\" (UniqueName: \"kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh\") pod \"openstackclient\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " pod="openstack/openstackclient" Dec 06 08:53:27 crc kubenswrapper[4954]: I1206 08:53:27.041486 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 08:53:27 crc kubenswrapper[4954]: I1206 08:53:27.481410 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 08:53:27 crc kubenswrapper[4954]: I1206 08:53:27.868793 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"56926ef8-fd5e-4746-972f-42a622dabda3","Type":"ContainerStarted","Data":"3d68edab8ad01f6c9d1a393738e8affa961d61f0e9d6e08a90a3473700331a01"} Dec 06 08:53:38 crc kubenswrapper[4954]: I1206 08:53:38.994461 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"56926ef8-fd5e-4746-972f-42a622dabda3","Type":"ContainerStarted","Data":"50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4"} Dec 06 08:53:39 crc kubenswrapper[4954]: I1206 08:53:39.019173 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.424841122 podStartE2EDuration="13.019146081s" podCreationTimestamp="2025-12-06 08:53:26 +0000 UTC" firstStartedPulling="2025-12-06 08:53:27.488226503 +0000 UTC m=+6982.301585892" lastFinishedPulling="2025-12-06 08:53:38.082531462 +0000 UTC m=+6992.895890851" observedRunningTime="2025-12-06 08:53:39.00821055 +0000 UTC m=+6993.821569959" watchObservedRunningTime="2025-12-06 08:53:39.019146081 +0000 UTC m=+6993.832505480" Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101474 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101812 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101853 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.102523 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.102604 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" gracePeriod=600 Dec 06 08:53:40 crc kubenswrapper[4954]: E1206 08:53:40.228166 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101474 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101812 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.101853 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.102523 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 08:53:40 crc kubenswrapper[4954]: I1206 08:53:40.102604 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" gracePeriod=600
Dec 06 08:53:40 crc kubenswrapper[4954]: E1206 08:53:40.228166 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:53:41 crc kubenswrapper[4954]: I1206 08:53:41.013130 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" exitCode=0
Dec 06 08:53:41 crc kubenswrapper[4954]: I1206 08:53:41.013219 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"}
Dec 06 08:53:41 crc kubenswrapper[4954]: I1206 08:53:41.013299 4954 scope.go:117] "RemoveContainer" containerID="e7e067e69197d16060215dd7fc7f0e43e8a7a1453c025d5bb5df2952f17a7ae4"
Dec 06 08:53:41 crc kubenswrapper[4954]: I1206 08:53:41.014280 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:53:41 crc kubenswrapper[4954]: E1206 08:53:41.014542 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:53:55 crc kubenswrapper[4954]: I1206 08:53:55.448864 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:53:55 crc kubenswrapper[4954]: E1206 08:53:55.449686 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:54:08 crc kubenswrapper[4954]: I1206 08:54:08.443145 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:54:08 crc kubenswrapper[4954]: E1206 08:54:08.443854 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:54:19 crc kubenswrapper[4954]: I1206 08:54:19.444999 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:54:19 crc kubenswrapper[4954]: E1206 08:54:19.445917 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:54:34 crc kubenswrapper[4954]: I1206 08:54:34.443965 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:54:34 crc kubenswrapper[4954]: E1206 08:54:34.445763 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:54:45 crc kubenswrapper[4954]: I1206 08:54:45.474387 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:54:45 crc kubenswrapper[4954]: E1206 08:54:45.475201 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.444011 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:54:59 crc kubenswrapper[4954]: E1206 08:54:59.444945 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
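The block above is a textbook crash-loop trace: the liveness probe fails at 08:53:40, the container is killed with a grace period, and every later sync attempt is rejected with CrashLoopBackOff until the restart back-off window expires. The repeating RemoveContainer / "Error syncing pod" pairs are periodic pod syncs hitting a closed gate, not the back-off schedule itself; the window doubles per failed restart up to the cap named in the message ("back-off 5m0s"). A minimal sketch of that capped doubling; the 10s initial delay is the kubelet's long-standing default and is background knowledge, not something this log prints:

```go
package main

import (
	"fmt"
	"time"
)

// nextBackoff returns the delay before the next restart attempt:
// double the previous delay, capped at limit (the "back-off 5m0s" above).
func nextBackoff(prev, limit time.Duration) time.Duration {
	if prev == 0 {
		return 10 * time.Second // assumed initial container restart back-off
	}
	if next := prev * 2; next < limit {
		return next
	}
	return limit
}

func main() {
	d := time.Duration(0)
	for i := 0; i < 7; i++ {
		d = nextBackoff(d, 5*time.Minute)
		fmt.Println(d) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
	}
}
```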
Need to start a new one" pod="openstack/barbican-db-create-twgpt" Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.824932 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-twgpt"] Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.835283 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0ef9-account-create-update-dxp8f"] Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.899410 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5vn5\" (UniqueName: \"kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.899460 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbsqc\" (UniqueName: \"kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.899572 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:54:59 crc kubenswrapper[4954]: I1206 08:54:59.899613 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.000766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.001175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.001252 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5vn5\" (UniqueName: \"kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.001269 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbsqc\" (UniqueName: 
\"kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.001737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.002709 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.025459 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5vn5\" (UniqueName: \"kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5\") pod \"barbican-0ef9-account-create-update-dxp8f\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.026280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbsqc\" (UniqueName: \"kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc\") pod \"barbican-db-create-twgpt\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.203698 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.229628 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.768374 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-twgpt"] Dec 06 08:55:00 crc kubenswrapper[4954]: I1206 08:55:00.828973 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0ef9-account-create-update-dxp8f"] Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.706849 4954 generic.go:334] "Generic (PLEG): container finished" podID="f9359d5c-ba3c-4a69-91ea-c13180163dc8" containerID="44b4bd98edb9bcd48ab2b4c3aec550a32514aa64e0069b68b3a9d2a665392849" exitCode=0 Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.707207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-twgpt" event={"ID":"f9359d5c-ba3c-4a69-91ea-c13180163dc8","Type":"ContainerDied","Data":"44b4bd98edb9bcd48ab2b4c3aec550a32514aa64e0069b68b3a9d2a665392849"} Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.707304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-twgpt" event={"ID":"f9359d5c-ba3c-4a69-91ea-c13180163dc8","Type":"ContainerStarted","Data":"6bb6eae71c75d5a044243c3d4d14c515173bbfa4dfae1fd4a80fdafac2631239"} Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.708748 4954 generic.go:334] "Generic (PLEG): container finished" podID="3d552c02-3df7-4f00-bd16-9c128a0fd274" containerID="ca8553feff5c2dc82ad34b81ceb5d46320297a21c5e130c084e1398088992129" exitCode=0 Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.708777 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ef9-account-create-update-dxp8f" event={"ID":"3d552c02-3df7-4f00-bd16-9c128a0fd274","Type":"ContainerDied","Data":"ca8553feff5c2dc82ad34b81ceb5d46320297a21c5e130c084e1398088992129"} Dec 06 08:55:01 crc kubenswrapper[4954]: I1206 08:55:01.708798 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ef9-account-create-update-dxp8f" event={"ID":"3d552c02-3df7-4f00-bd16-9c128a0fd274","Type":"ContainerStarted","Data":"b716d071abdc4739b295c916adc834815fc0dbbd3bf838cc745a8e88dc8372f6"} Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.100918 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.108595 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289141 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5vn5\" (UniqueName: \"kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5\") pod \"3d552c02-3df7-4f00-bd16-9c128a0fd274\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbsqc\" (UniqueName: \"kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc\") pod \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289265 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts\") pod \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\" (UID: \"f9359d5c-ba3c-4a69-91ea-c13180163dc8\") " Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289403 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts\") pod \"3d552c02-3df7-4f00-bd16-9c128a0fd274\" (UID: \"3d552c02-3df7-4f00-bd16-9c128a0fd274\") " Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289871 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9359d5c-ba3c-4a69-91ea-c13180163dc8" (UID: "f9359d5c-ba3c-4a69-91ea-c13180163dc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.289966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d552c02-3df7-4f00-bd16-9c128a0fd274" (UID: "3d552c02-3df7-4f00-bd16-9c128a0fd274"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.296466 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc" (OuterVolumeSpecName: "kube-api-access-cbsqc") pod "f9359d5c-ba3c-4a69-91ea-c13180163dc8" (UID: "f9359d5c-ba3c-4a69-91ea-c13180163dc8"). InnerVolumeSpecName "kube-api-access-cbsqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.296584 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5" (OuterVolumeSpecName: "kube-api-access-c5vn5") pod "3d552c02-3df7-4f00-bd16-9c128a0fd274" (UID: "3d552c02-3df7-4f00-bd16-9c128a0fd274"). InnerVolumeSpecName "kube-api-access-c5vn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.391383 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d552c02-3df7-4f00-bd16-9c128a0fd274-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.391434 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5vn5\" (UniqueName: \"kubernetes.io/projected/3d552c02-3df7-4f00-bd16-9c128a0fd274-kube-api-access-c5vn5\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.391449 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbsqc\" (UniqueName: \"kubernetes.io/projected/f9359d5c-ba3c-4a69-91ea-c13180163dc8-kube-api-access-cbsqc\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.391461 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9359d5c-ba3c-4a69-91ea-c13180163dc8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:03 crc kubenswrapper[4954]: E1206 08:55:03.566064 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9359d5c_ba3c_4a69_91ea_c13180163dc8.slice/crio-6bb6eae71c75d5a044243c3d4d14c515173bbfa4dfae1fd4a80fdafac2631239\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9359d5c_ba3c_4a69_91ea_c13180163dc8.slice\": RecentStats: unable to find data in memory cache]" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.726899 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ef9-account-create-update-dxp8f" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.726882 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ef9-account-create-update-dxp8f" event={"ID":"3d552c02-3df7-4f00-bd16-9c128a0fd274","Type":"ContainerDied","Data":"b716d071abdc4739b295c916adc834815fc0dbbd3bf838cc745a8e88dc8372f6"} Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.727038 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b716d071abdc4739b295c916adc834815fc0dbbd3bf838cc745a8e88dc8372f6" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.727928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-twgpt" event={"ID":"f9359d5c-ba3c-4a69-91ea-c13180163dc8","Type":"ContainerDied","Data":"6bb6eae71c75d5a044243c3d4d14c515173bbfa4dfae1fd4a80fdafac2631239"} Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.727963 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb6eae71c75d5a044243c3d4d14c515173bbfa4dfae1fd4a80fdafac2631239" Dec 06 08:55:03 crc kubenswrapper[4954]: I1206 08:55:03.727977 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-twgpt" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.144370 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kstzz"] Dec 06 08:55:05 crc kubenswrapper[4954]: E1206 08:55:05.144983 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d552c02-3df7-4f00-bd16-9c128a0fd274" containerName="mariadb-account-create-update" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.144996 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d552c02-3df7-4f00-bd16-9c128a0fd274" containerName="mariadb-account-create-update" Dec 06 08:55:05 crc kubenswrapper[4954]: E1206 08:55:05.145014 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9359d5c-ba3c-4a69-91ea-c13180163dc8" containerName="mariadb-database-create" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.145021 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9359d5c-ba3c-4a69-91ea-c13180163dc8" containerName="mariadb-database-create" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.145192 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9359d5c-ba3c-4a69-91ea-c13180163dc8" containerName="mariadb-database-create" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.145207 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d552c02-3df7-4f00-bd16-9c128a0fd274" containerName="mariadb-account-create-update" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.145795 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.148177 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.148877 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2xdpn" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.159462 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kstzz"] Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.336845 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.336989 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.337055 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwpgt\" (UniqueName: \"kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.438792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.438938 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.439009 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpgt\" (UniqueName: \"kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.445002 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.458306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.461084 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpgt\" (UniqueName: \"kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt\") pod \"barbican-db-sync-kstzz\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.474451 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:05 crc kubenswrapper[4954]: I1206 08:55:05.963364 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kstzz"] Dec 06 08:55:06 crc kubenswrapper[4954]: I1206 08:55:06.768353 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kstzz" event={"ID":"c6e30c14-e779-4b5d-ab68-479e2d136bdb","Type":"ContainerStarted","Data":"244d93642c59f4e5f9f3220269f3987470f46eb2346eea0f783bd29c8c3fac5d"} Dec 06 08:55:10 crc kubenswrapper[4954]: I1206 08:55:10.818830 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kstzz" event={"ID":"c6e30c14-e779-4b5d-ab68-479e2d136bdb","Type":"ContainerStarted","Data":"6b756174f73f0e8fe692af2bab18bcb0e3c4582b66190530c405f910ae983e36"} Dec 06 08:55:10 crc kubenswrapper[4954]: I1206 08:55:10.841192 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kstzz" podStartSLOduration=1.545989823 podStartE2EDuration="5.84115575s" podCreationTimestamp="2025-12-06 08:55:05 +0000 UTC" firstStartedPulling="2025-12-06 08:55:05.972438978 +0000 UTC m=+7080.785798367" lastFinishedPulling="2025-12-06 08:55:10.267604905 +0000 UTC m=+7085.080964294" observedRunningTime="2025-12-06 08:55:10.839016083 +0000 UTC m=+7085.652375482" watchObservedRunningTime="2025-12-06 08:55:10.84115575 +0000 UTC m=+7085.654515139" Dec 06 08:55:11 crc kubenswrapper[4954]: I1206 08:55:11.443892 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:55:11 crc kubenswrapper[4954]: E1206 08:55:11.444536 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:55:12 crc kubenswrapper[4954]: I1206 08:55:12.835927 4954 generic.go:334] "Generic (PLEG): container finished" podID="c6e30c14-e779-4b5d-ab68-479e2d136bdb" containerID="6b756174f73f0e8fe692af2bab18bcb0e3c4582b66190530c405f910ae983e36" exitCode=0 Dec 06 08:55:12 crc kubenswrapper[4954]: I1206 08:55:12.836021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kstzz" event={"ID":"c6e30c14-e779-4b5d-ab68-479e2d136bdb","Type":"ContainerDied","Data":"6b756174f73f0e8fe692af2bab18bcb0e3c4582b66190530c405f910ae983e36"} Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.196185 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.301110 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data\") pod \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.301228 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwpgt\" (UniqueName: \"kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt\") pod \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.301384 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle\") pod \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\" (UID: \"c6e30c14-e779-4b5d-ab68-479e2d136bdb\") " Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.307194 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c6e30c14-e779-4b5d-ab68-479e2d136bdb" (UID: "c6e30c14-e779-4b5d-ab68-479e2d136bdb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.308123 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt" (OuterVolumeSpecName: "kube-api-access-pwpgt") pod "c6e30c14-e779-4b5d-ab68-479e2d136bdb" (UID: "c6e30c14-e779-4b5d-ab68-479e2d136bdb"). InnerVolumeSpecName "kube-api-access-pwpgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.326240 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6e30c14-e779-4b5d-ab68-479e2d136bdb" (UID: "c6e30c14-e779-4b5d-ab68-479e2d136bdb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.404068 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.404113 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6e30c14-e779-4b5d-ab68-479e2d136bdb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.404131 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwpgt\" (UniqueName: \"kubernetes.io/projected/c6e30c14-e779-4b5d-ab68-479e2d136bdb-kube-api-access-pwpgt\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.856464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kstzz" event={"ID":"c6e30c14-e779-4b5d-ab68-479e2d136bdb","Type":"ContainerDied","Data":"244d93642c59f4e5f9f3220269f3987470f46eb2346eea0f783bd29c8c3fac5d"} Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.856849 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244d93642c59f4e5f9f3220269f3987470f46eb2346eea0f783bd29c8c3fac5d" Dec 06 08:55:14 crc kubenswrapper[4954]: I1206 08:55:14.856550 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kstzz" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.025520 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-74c854b897-gshsw"] Dec 06 08:55:15 crc kubenswrapper[4954]: E1206 08:55:15.025903 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e30c14-e779-4b5d-ab68-479e2d136bdb" containerName="barbican-db-sync" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.025920 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e30c14-e779-4b5d-ab68-479e2d136bdb" containerName="barbican-db-sync" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.026076 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e30c14-e779-4b5d-ab68-479e2d136bdb" containerName="barbican-db-sync" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.026975 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.032465 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-774cb646b6-tnq5v"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.033845 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.042398 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.042662 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.042948 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.043443 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2xdpn" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.088728 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74c854b897-gshsw"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.094595 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-774cb646b6-tnq5v"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134267 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data-custom\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134359 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04dc125f-8988-4195-975a-a8156cfc59f4-logs\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134429 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134465 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-logs\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134509 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7db\" (UniqueName: \"kubernetes.io/projected/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-kube-api-access-7b7db\") pod \"barbican-worker-74c854b897-gshsw\" (UID: 
\"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data-custom\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134615 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-combined-ca-bundle\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134650 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l79g\" (UniqueName: \"kubernetes.io/projected/04dc125f-8988-4195-975a-a8156cfc59f4-kube-api-access-6l79g\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.134714 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-combined-ca-bundle\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.138415 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.140359 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.158927 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.223841 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"] Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.225176 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.228178 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.235834 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data-custom\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.235894 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.235941 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.235962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04dc125f-8988-4195-975a-a8156cfc59f4-logs\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.235996 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236028 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-logs\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236065 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7db\" (UniqueName: \"kubernetes.io/projected/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-kube-api-access-7b7db\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236096 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data-custom\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236129 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-combined-ca-bundle\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236177 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l79g\" (UniqueName: \"kubernetes.io/projected/04dc125f-8988-4195-975a-a8156cfc59f4-kube-api-access-6l79g\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlksc\" (UniqueName: \"kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236228 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.236297 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-combined-ca-bundle\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.237657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04dc125f-8988-4195-975a-a8156cfc59f4-logs\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.239295 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-logs\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw" Dec 06 
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.245718 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.246280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data-custom\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.247328 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-config-data\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.254687 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-combined-ca-bundle\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.263404 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"]
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.266774 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7db\" (UniqueName: \"kubernetes.io/projected/c3694a89-4fb6-43ef-abe2-dcc7f455d4d4-kube-api-access-7b7db\") pod \"barbican-worker-74c854b897-gshsw\" (UID: \"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4\") " pod="openstack/barbican-worker-74c854b897-gshsw"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.268190 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-config-data-custom\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.268816 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04dc125f-8988-4195-975a-a8156cfc59f4-combined-ca-bundle\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.269183 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l79g\" (UniqueName: \"kubernetes.io/projected/04dc125f-8988-4195-975a-a8156cfc59f4-kube-api-access-6l79g\") pod \"barbican-keystone-listener-774cb646b6-tnq5v\" (UID: \"04dc125f-8988-4195-975a-a8156cfc59f4\") " pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.337970 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsg5\" (UniqueName: \"kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338039 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338401 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338463 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlksc\" (UniqueName: \"kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338485 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338525 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338547 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338590 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.338630 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.340035 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.340994 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.342340 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.342386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.356551 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlksc\" (UniqueName: \"kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc\") pod \"dnsmasq-dns-78f54cd975-dcltx\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") " pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.373391 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74c854b897-gshsw"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.395717 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.447498 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsg5\" (UniqueName: \"kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.447660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.447737 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.447760 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.447807 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.449195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.455064 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.455071 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.456901 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk"
Dec 06
08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.463466 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.486270 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsg5\" (UniqueName: \"kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5\") pod \"barbican-api-57d77cbfb6-mdjnk\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:15 crc kubenswrapper[4954]: I1206 08:55:15.628083 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.058526 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74c854b897-gshsw"] Dec 06 08:55:16 crc kubenswrapper[4954]: W1206 08:55:16.227527 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b979809_e192_4ab6_bb03_08dc38cd18ad.slice/crio-d4acf9d57ee12f6c32cf0a5f35d8fb55d536aefbbcf16440c05a6265453c3816 WatchSource:0}: Error finding container d4acf9d57ee12f6c32cf0a5f35d8fb55d536aefbbcf16440c05a6265453c3816: Status 404 returned error can't find the container with id d4acf9d57ee12f6c32cf0a5f35d8fb55d536aefbbcf16440c05a6265453c3816 Dec 06 08:55:16 crc kubenswrapper[4954]: W1206 08:55:16.231147 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04dc125f_8988_4195_975a_a8156cfc59f4.slice/crio-87ac1df8cff8362bbd72f16babe7f1ca07ce4c15668b65004c00879085757dc0 WatchSource:0}: Error finding container 87ac1df8cff8362bbd72f16babe7f1ca07ce4c15668b65004c00879085757dc0: Status 404 returned error can't find the container with id 87ac1df8cff8362bbd72f16babe7f1ca07ce4c15668b65004c00879085757dc0 Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.231959 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"] Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.241107 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-774cb646b6-tnq5v"] Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.357395 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"] Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.913266 4954 generic.go:334] "Generic (PLEG): container finished" podID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerID="80cee4ea69672d7aec6aa3bcd0b8881418a5d92e0ad74501f1ded895bf634a8c" exitCode=0 Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.913420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" event={"ID":"18d52357-0b5c-4d38-abc9-28eef64b0987","Type":"ContainerDied","Data":"80cee4ea69672d7aec6aa3bcd0b8881418a5d92e0ad74501f1ded895bf634a8c"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.913681 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" event={"ID":"18d52357-0b5c-4d38-abc9-28eef64b0987","Type":"ContainerStarted","Data":"313efe409e1bd0e95fa62ebc6f3c728b09263da6c3a0168fa19897fd03cb82d1"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.918464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-74c854b897-gshsw" event={"ID":"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4","Type":"ContainerStarted","Data":"20cf67d21338cc080b82e0e3457dc56c83760899e04980eea2a394851a2fc437"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.920982 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerStarted","Data":"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.921016 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerStarted","Data":"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.921026 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerStarted","Data":"d4acf9d57ee12f6c32cf0a5f35d8fb55d536aefbbcf16440c05a6265453c3816"} Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.921874 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.921914 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:16 crc kubenswrapper[4954]: I1206 08:55:16.923608 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" event={"ID":"04dc125f-8988-4195-975a-a8156cfc59f4","Type":"ContainerStarted","Data":"87ac1df8cff8362bbd72f16babe7f1ca07ce4c15668b65004c00879085757dc0"} Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.230252 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57d77cbfb6-mdjnk" podStartSLOduration=2.230224127 podStartE2EDuration="2.230224127s" podCreationTimestamp="2025-12-06 08:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:16.971999474 +0000 UTC m=+7091.785358863" watchObservedRunningTime="2025-12-06 08:55:17.230224127 +0000 UTC m=+7092.043583516" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.231762 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9d56f68b6-zctqj"] Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.246180 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d56f68b6-zctqj"] Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.253728 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.256802 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.277106 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.396408 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-combined-ca-bundle\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.396796 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.396838 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-public-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.396880 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fm4m\" (UniqueName: \"kubernetes.io/projected/f4aeecef-ecdc-4aa0-861b-d0a262302982-kube-api-access-4fm4m\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.397106 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data-custom\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.397231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4aeecef-ecdc-4aa0-861b-d0a262302982-logs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.397847 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-internal-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.499934 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fm4m\" (UniqueName: \"kubernetes.io/projected/f4aeecef-ecdc-4aa0-861b-d0a262302982-kube-api-access-4fm4m\") pod 
\"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data-custom\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4aeecef-ecdc-4aa0-861b-d0a262302982-logs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500147 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-internal-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500256 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-combined-ca-bundle\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500290 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500336 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-public-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.500649 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4aeecef-ecdc-4aa0-861b-d0a262302982-logs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.516511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-public-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.516531 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-internal-tls-certs\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " 
pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.517803 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.518172 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-combined-ca-bundle\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.519705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4aeecef-ecdc-4aa0-861b-d0a262302982-config-data-custom\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.521497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fm4m\" (UniqueName: \"kubernetes.io/projected/f4aeecef-ecdc-4aa0-861b-d0a262302982-kube-api-access-4fm4m\") pod \"barbican-api-9d56f68b6-zctqj\" (UID: \"f4aeecef-ecdc-4aa0-861b-d0a262302982\") " pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.635865 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.943220 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" event={"ID":"18d52357-0b5c-4d38-abc9-28eef64b0987","Type":"ContainerStarted","Data":"903979689bf5a6b261e67f3a7d1d7c626c2b9bdd7a6c09d161614dd9163913b9"} Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.943515 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:17 crc kubenswrapper[4954]: I1206 08:55:17.966469 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" podStartSLOduration=2.966448262 podStartE2EDuration="2.966448262s" podCreationTimestamp="2025-12-06 08:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:17.960064712 +0000 UTC m=+7092.773424101" watchObservedRunningTime="2025-12-06 08:55:17.966448262 +0000 UTC m=+7092.779807651" Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.782072 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d56f68b6-zctqj"] Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.963741 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74c854b897-gshsw" event={"ID":"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4","Type":"ContainerStarted","Data":"e012d0cdb626e56b67f26c2580bbb424d2a608c3a0d9f712444e295d15acd6c6"} Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.963790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74c854b897-gshsw" 
event={"ID":"c3694a89-4fb6-43ef-abe2-dcc7f455d4d4","Type":"ContainerStarted","Data":"ca9da4cb44434019a972a590dfa25153d958a47f6847aea888232816a67a2117"} Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.980410 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" event={"ID":"04dc125f-8988-4195-975a-a8156cfc59f4","Type":"ContainerStarted","Data":"9f0d17bd8a2f0b19e114d15ef691f84c4e0c2e9a8b6270b2d16444410267fa8b"} Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.980448 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" event={"ID":"04dc125f-8988-4195-975a-a8156cfc59f4","Type":"ContainerStarted","Data":"fbb0e5cc77005f305867ac6ecb2ff7d9607eece9f3cbc3b0357fd6aced534f7d"} Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.987074 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d56f68b6-zctqj" event={"ID":"f4aeecef-ecdc-4aa0-861b-d0a262302982","Type":"ContainerStarted","Data":"47451424cdd4df1649a79edd1a59af86ead6aa55beb64511a902623ed9874fb1"} Dec 06 08:55:18 crc kubenswrapper[4954]: I1206 08:55:18.992980 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-74c854b897-gshsw" podStartSLOduration=2.897065161 podStartE2EDuration="4.992934152s" podCreationTimestamp="2025-12-06 08:55:14 +0000 UTC" firstStartedPulling="2025-12-06 08:55:16.064335347 +0000 UTC m=+7090.877694736" lastFinishedPulling="2025-12-06 08:55:18.160204338 +0000 UTC m=+7092.973563727" observedRunningTime="2025-12-06 08:55:18.982137525 +0000 UTC m=+7093.795496924" watchObservedRunningTime="2025-12-06 08:55:18.992934152 +0000 UTC m=+7093.806293541" Dec 06 08:55:19 crc kubenswrapper[4954]: I1206 08:55:19.015443 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-774cb646b6-tnq5v" podStartSLOduration=3.083426011 podStartE2EDuration="5.015422871s" podCreationTimestamp="2025-12-06 08:55:14 +0000 UTC" firstStartedPulling="2025-12-06 08:55:16.233114069 +0000 UTC m=+7091.046473458" lastFinishedPulling="2025-12-06 08:55:18.165110929 +0000 UTC m=+7092.978470318" observedRunningTime="2025-12-06 08:55:19.010975772 +0000 UTC m=+7093.824335161" watchObservedRunningTime="2025-12-06 08:55:19.015422871 +0000 UTC m=+7093.828782260" Dec 06 08:55:19 crc kubenswrapper[4954]: I1206 08:55:19.995919 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d56f68b6-zctqj" event={"ID":"f4aeecef-ecdc-4aa0-861b-d0a262302982","Type":"ContainerStarted","Data":"fa44112cf9b5c18cf41115d8c805cef436ed7c6133eda33af9ff5b2e9b53c512"} Dec 06 08:55:19 crc kubenswrapper[4954]: I1206 08:55:19.996494 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:19 crc kubenswrapper[4954]: I1206 08:55:19.996532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d56f68b6-zctqj" event={"ID":"f4aeecef-ecdc-4aa0-861b-d0a262302982","Type":"ContainerStarted","Data":"4d53f93da0278926530bdeafb17b6e7ca727e3a52ba36717d7a3fb2c7deb8eb9"} Dec 06 08:55:20 crc kubenswrapper[4954]: I1206 08:55:20.037196 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9d56f68b6-zctqj" podStartSLOduration=3.0371717560000002 podStartE2EDuration="3.037171756s" podCreationTimestamp="2025-12-06 08:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:20.028153696 +0000 UTC m=+7094.841513085" watchObservedRunningTime="2025-12-06 08:55:20.037171756 +0000 UTC m=+7094.850531145" Dec 06 08:55:21 crc kubenswrapper[4954]: I1206 08:55:21.002984 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:23 crc kubenswrapper[4954]: I1206 08:55:23.443193 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:55:23 crc kubenswrapper[4954]: E1206 08:55:23.443717 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:55:25 crc kubenswrapper[4954]: I1206 08:55:25.465378 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" Dec 06 08:55:25 crc kubenswrapper[4954]: I1206 08:55:25.537002 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:55:25 crc kubenswrapper[4954]: I1206 08:55:25.537280 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="dnsmasq-dns" containerID="cri-o://1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08" gracePeriod=10 Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.041769 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.047489 4954 generic.go:334] "Generic (PLEG): container finished" podID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerID="1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08" exitCode=0 Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.047540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" event={"ID":"511a55c7-38eb-4bed-a1b2-fabf31088a7f","Type":"ContainerDied","Data":"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08"} Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.047635 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" event={"ID":"511a55c7-38eb-4bed-a1b2-fabf31088a7f","Type":"ContainerDied","Data":"41de9ec3760f5c7bc1c3f3a578be7562f4dcec364f5bdc793e7371632c919b6b"} Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.047651 4954 scope.go:117] "RemoveContainer" containerID="1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.094047 4954 scope.go:117] "RemoveContainer" containerID="eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.124702 4954 scope.go:117] "RemoveContainer" containerID="1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08" Dec 06 08:55:26 crc kubenswrapper[4954]: E1206 08:55:26.125369 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08\": container with ID starting with 1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08 not found: ID does not exist" containerID="1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.125637 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08"} err="failed to get container status \"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08\": rpc error: code = NotFound desc = could not find container \"1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08\": container with ID starting with 1948c38edc7e0e732906870967cc1b74ef13074a468c2c88d5f57c9297df1c08 not found: ID does not exist" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.125674 4954 scope.go:117] "RemoveContainer" containerID="eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982" Dec 06 08:55:26 crc kubenswrapper[4954]: E1206 08:55:26.129784 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982\": container with ID starting with eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982 not found: ID does not exist" containerID="eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.129848 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982"} err="failed to get container status \"eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982\": rpc error: code = NotFound desc 
= could not find container \"eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982\": container with ID starting with eb02962ebc321424f484beab13799ab59e7848c8a359502020dd1c8bb4220982 not found: ID does not exist" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.179614 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb\") pod \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.179753 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gzrr\" (UniqueName: \"kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr\") pod \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.179823 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb\") pod \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.179966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config\") pod \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.180003 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc\") pod \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\" (UID: \"511a55c7-38eb-4bed-a1b2-fabf31088a7f\") " Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.197032 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr" (OuterVolumeSpecName: "kube-api-access-8gzrr") pod "511a55c7-38eb-4bed-a1b2-fabf31088a7f" (UID: "511a55c7-38eb-4bed-a1b2-fabf31088a7f"). InnerVolumeSpecName "kube-api-access-8gzrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.227854 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "511a55c7-38eb-4bed-a1b2-fabf31088a7f" (UID: "511a55c7-38eb-4bed-a1b2-fabf31088a7f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.230138 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "511a55c7-38eb-4bed-a1b2-fabf31088a7f" (UID: "511a55c7-38eb-4bed-a1b2-fabf31088a7f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.233125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "511a55c7-38eb-4bed-a1b2-fabf31088a7f" (UID: "511a55c7-38eb-4bed-a1b2-fabf31088a7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.234781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config" (OuterVolumeSpecName: "config") pod "511a55c7-38eb-4bed-a1b2-fabf31088a7f" (UID: "511a55c7-38eb-4bed-a1b2-fabf31088a7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.282381 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.282424 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.282433 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.282442 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/511a55c7-38eb-4bed-a1b2-fabf31088a7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:26 crc kubenswrapper[4954]: I1206 08:55:26.282453 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gzrr\" (UniqueName: \"kubernetes.io/projected/511a55c7-38eb-4bed-a1b2-fabf31088a7f-kube-api-access-8gzrr\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.057656 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79577dcf9-s97g2" Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.101666 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.115156 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79577dcf9-s97g2"] Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.302867 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.430328 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:27 crc kubenswrapper[4954]: I1206 08:55:27.455184 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" path="/var/lib/kubelet/pods/511a55c7-38eb-4bed-a1b2-fabf31088a7f/volumes" Dec 06 08:55:29 crc kubenswrapper[4954]: I1206 08:55:29.127346 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:29 crc kubenswrapper[4954]: I1206 08:55:29.178400 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d56f68b6-zctqj" Dec 06 08:55:29 crc kubenswrapper[4954]: I1206 08:55:29.244544 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"] Dec 06 08:55:29 crc kubenswrapper[4954]: I1206 08:55:29.245132 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57d77cbfb6-mdjnk" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api-log" containerID="cri-o://656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f" gracePeriod=30 Dec 06 08:55:29 crc kubenswrapper[4954]: I1206 08:55:29.245847 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57d77cbfb6-mdjnk" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api" containerID="cri-o://e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231" gracePeriod=30 Dec 06 08:55:30 crc kubenswrapper[4954]: I1206 08:55:30.094033 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerID="656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f" exitCode=143 Dec 06 08:55:30 crc kubenswrapper[4954]: I1206 08:55:30.094174 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerDied","Data":"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f"} Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.397393 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57d77cbfb6-mdjnk" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read tcp 10.217.0.2:57712->10.217.1.47:9311: read: connection reset by peer" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.397786 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57d77cbfb6-mdjnk" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read 
tcp 10.217.0.2:57718->10.217.1.47:9311: read: connection reset by peer" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.828954 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.903491 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bsg5\" (UniqueName: \"kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5\") pod \"9b979809-e192-4ab6-bb03-08dc38cd18ad\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.903716 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data\") pod \"9b979809-e192-4ab6-bb03-08dc38cd18ad\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.903768 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs\") pod \"9b979809-e192-4ab6-bb03-08dc38cd18ad\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.903785 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom\") pod \"9b979809-e192-4ab6-bb03-08dc38cd18ad\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.903804 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle\") pod \"9b979809-e192-4ab6-bb03-08dc38cd18ad\" (UID: \"9b979809-e192-4ab6-bb03-08dc38cd18ad\") " Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.904490 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs" (OuterVolumeSpecName: "logs") pod "9b979809-e192-4ab6-bb03-08dc38cd18ad" (UID: "9b979809-e192-4ab6-bb03-08dc38cd18ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.911937 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b979809-e192-4ab6-bb03-08dc38cd18ad" (UID: "9b979809-e192-4ab6-bb03-08dc38cd18ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.912045 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5" (OuterVolumeSpecName: "kube-api-access-4bsg5") pod "9b979809-e192-4ab6-bb03-08dc38cd18ad" (UID: "9b979809-e192-4ab6-bb03-08dc38cd18ad"). InnerVolumeSpecName "kube-api-access-4bsg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.943665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b979809-e192-4ab6-bb03-08dc38cd18ad" (UID: "9b979809-e192-4ab6-bb03-08dc38cd18ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:32 crc kubenswrapper[4954]: I1206 08:55:32.948319 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data" (OuterVolumeSpecName: "config-data") pod "9b979809-e192-4ab6-bb03-08dc38cd18ad" (UID: "9b979809-e192-4ab6-bb03-08dc38cd18ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.006295 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bsg5\" (UniqueName: \"kubernetes.io/projected/9b979809-e192-4ab6-bb03-08dc38cd18ad-kube-api-access-4bsg5\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.006346 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.006358 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b979809-e192-4ab6-bb03-08dc38cd18ad-logs\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.006367 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.006375 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b979809-e192-4ab6-bb03-08dc38cd18ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.117210 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerID="e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231" exitCode=0 Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.117285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerDied","Data":"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231"} Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.117316 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d77cbfb6-mdjnk" event={"ID":"9b979809-e192-4ab6-bb03-08dc38cd18ad","Type":"ContainerDied","Data":"d4acf9d57ee12f6c32cf0a5f35d8fb55d536aefbbcf16440c05a6265453c3816"} Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.117335 4954 scope.go:117] "RemoveContainer" containerID="e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.117483 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57d77cbfb6-mdjnk" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.158370 4954 scope.go:117] "RemoveContainer" containerID="656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.161281 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"] Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.170330 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57d77cbfb6-mdjnk"] Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.182825 4954 scope.go:117] "RemoveContainer" containerID="e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231" Dec 06 08:55:33 crc kubenswrapper[4954]: E1206 08:55:33.183703 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231\": container with ID starting with e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231 not found: ID does not exist" containerID="e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.183786 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231"} err="failed to get container status \"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231\": rpc error: code = NotFound desc = could not find container \"e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231\": container with ID starting with e1bab8f718bbdaf271aa1917bde095f3ec24c2fdd7dfec366325b20638d48231 not found: ID does not exist" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.183838 4954 scope.go:117] "RemoveContainer" containerID="656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f" Dec 06 08:55:33 crc kubenswrapper[4954]: E1206 08:55:33.184479 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f\": container with ID starting with 656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f not found: ID does not exist" containerID="656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.184512 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f"} err="failed to get container status \"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f\": rpc error: code = NotFound desc = could not find container \"656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f\": container with ID starting with 656718cde07d89dea2934e7a932842edbb99f974e7c505f722502de4b015721f not found: ID does not exist" Dec 06 08:55:33 crc kubenswrapper[4954]: I1206 08:55:33.455097 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" path="/var/lib/kubelet/pods/9b979809-e192-4ab6-bb03-08dc38cd18ad/volumes" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.088266 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-z8n9l"] Dec 06 08:55:37 crc kubenswrapper[4954]: E1206 08:55:37.089133 4954 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="init" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089148 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="init" Dec 06 08:55:37 crc kubenswrapper[4954]: E1206 08:55:37.089164 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="dnsmasq-dns" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089170 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="dnsmasq-dns" Dec 06 08:55:37 crc kubenswrapper[4954]: E1206 08:55:37.089185 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089191 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api" Dec 06 08:55:37 crc kubenswrapper[4954]: E1206 08:55:37.089203 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api-log" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089210 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api-log" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089374 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="511a55c7-38eb-4bed-a1b2-fabf31088a7f" containerName="dnsmasq-dns" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089390 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api-log" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.089400 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b979809-e192-4ab6-bb03-08dc38cd18ad" containerName="barbican-api" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.090140 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.101397 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z8n9l"] Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.179308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.179715 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8zx\" (UniqueName: \"kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.190375 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fa1f-account-create-update-2lftp"] Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.191499 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.193277 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.206265 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fa1f-account-create-update-2lftp"] Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.281817 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts\") pod \"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.282072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.282206 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8zx\" (UniqueName: \"kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.282346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dmnx\" (UniqueName: \"kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx\") pod \"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.283010 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.301498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8zx\" (UniqueName: \"kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx\") pod \"neutron-db-create-z8n9l\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.386188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts\") pod \"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.386335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dmnx\" (UniqueName: \"kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx\") pod 
\"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.387372 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts\") pod \"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.412158 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.415320 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dmnx\" (UniqueName: \"kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx\") pod \"neutron-fa1f-account-create-update-2lftp\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.445826 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:55:37 crc kubenswrapper[4954]: E1206 08:55:37.446079 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.509332 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:37 crc kubenswrapper[4954]: I1206 08:55:37.851425 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z8n9l"] Dec 06 08:55:37 crc kubenswrapper[4954]: W1206 08:55:37.860138 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2198ae3_47db_45ba_990d_2bf1c32800e3.slice/crio-2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3 WatchSource:0}: Error finding container 2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3: Status 404 returned error can't find the container with id 2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3 Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.003497 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fa1f-account-create-update-2lftp"] Dec 06 08:55:38 crc kubenswrapper[4954]: W1206 08:55:38.010393 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod667c7f97_0625_43e8_bd1d_b248f07f231f.slice/crio-996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862 WatchSource:0}: Error finding container 996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862: Status 404 returned error can't find the container with id 996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862 Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.170131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fa1f-account-create-update-2lftp" event={"ID":"667c7f97-0625-43e8-bd1d-b248f07f231f","Type":"ContainerStarted","Data":"94601b554cf78848be44218c3a266428e96317bd91aee40881f5e1b614fbe3e1"} Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.170191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fa1f-account-create-update-2lftp" event={"ID":"667c7f97-0625-43e8-bd1d-b248f07f231f","Type":"ContainerStarted","Data":"996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862"} Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.174400 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z8n9l" event={"ID":"c2198ae3-47db-45ba-990d-2bf1c32800e3","Type":"ContainerStarted","Data":"408ace98b13ac0a40a56d12e3715987414139b725ebe95977a65f9b57dbf9d52"} Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.174443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z8n9l" event={"ID":"c2198ae3-47db-45ba-990d-2bf1c32800e3","Type":"ContainerStarted","Data":"2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3"} Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.193197 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.194999 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.195798 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fa1f-account-create-update-2lftp" podStartSLOduration=1.19577237 podStartE2EDuration="1.19577237s" podCreationTimestamp="2025-12-06 08:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:38.191992919 +0000 UTC m=+7113.005352318" watchObservedRunningTime="2025-12-06 08:55:38.19577237 +0000 UTC m=+7113.009131759" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.209875 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.211794 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-z8n9l" podStartSLOduration=1.2117711660000001 podStartE2EDuration="1.211771166s" podCreationTimestamp="2025-12-06 08:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:38.210674546 +0000 UTC m=+7113.024033935" watchObservedRunningTime="2025-12-06 08:55:38.211771166 +0000 UTC m=+7113.025130565" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.306368 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.306465 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.306534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf45l\" (UniqueName: \"kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.408127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.408185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf45l\" (UniqueName: \"kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.408289 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.408694 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.408785 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.432621 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf45l\" (UniqueName: \"kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l\") pod \"redhat-marketplace-dsrn9\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:38 crc kubenswrapper[4954]: I1206 08:55:38.548913 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 08:55:39.055047 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:39 crc kubenswrapper[4954]: W1206 08:55:39.060091 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod752d1b6e_0838_4d77_bd81_0b85d7266e2a.slice/crio-1f9839e027c7da5b371e88f487a177d19da54f9e50b4a645c7953e57ba69b3dc WatchSource:0}: Error finding container 1f9839e027c7da5b371e88f487a177d19da54f9e50b4a645c7953e57ba69b3dc: Status 404 returned error can't find the container with id 1f9839e027c7da5b371e88f487a177d19da54f9e50b4a645c7953e57ba69b3dc Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 08:55:39.184323 4954 generic.go:334] "Generic (PLEG): container finished" podID="667c7f97-0625-43e8-bd1d-b248f07f231f" containerID="94601b554cf78848be44218c3a266428e96317bd91aee40881f5e1b614fbe3e1" exitCode=0 Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 08:55:39.184367 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fa1f-account-create-update-2lftp" event={"ID":"667c7f97-0625-43e8-bd1d-b248f07f231f","Type":"ContainerDied","Data":"94601b554cf78848be44218c3a266428e96317bd91aee40881f5e1b614fbe3e1"} Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 08:55:39.186326 4954 generic.go:334] "Generic (PLEG): container finished" podID="c2198ae3-47db-45ba-990d-2bf1c32800e3" containerID="408ace98b13ac0a40a56d12e3715987414139b725ebe95977a65f9b57dbf9d52" exitCode=0 Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 08:55:39.186386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z8n9l" event={"ID":"c2198ae3-47db-45ba-990d-2bf1c32800e3","Type":"ContainerDied","Data":"408ace98b13ac0a40a56d12e3715987414139b725ebe95977a65f9b57dbf9d52"} Dec 06 08:55:39 crc kubenswrapper[4954]: I1206 
08:55:39.188079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerStarted","Data":"1f9839e027c7da5b371e88f487a177d19da54f9e50b4a645c7953e57ba69b3dc"} Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.198253 4954 generic.go:334] "Generic (PLEG): container finished" podID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerID="da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa" exitCode=0 Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.198370 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerDied","Data":"da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa"} Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.634157 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.638952 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.646185 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc8zx\" (UniqueName: \"kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx\") pod \"c2198ae3-47db-45ba-990d-2bf1c32800e3\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.646228 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts\") pod \"c2198ae3-47db-45ba-990d-2bf1c32800e3\" (UID: \"c2198ae3-47db-45ba-990d-2bf1c32800e3\") " Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.646296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts\") pod \"667c7f97-0625-43e8-bd1d-b248f07f231f\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.646401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dmnx\" (UniqueName: \"kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx\") pod \"667c7f97-0625-43e8-bd1d-b248f07f231f\" (UID: \"667c7f97-0625-43e8-bd1d-b248f07f231f\") " Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.646968 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2198ae3-47db-45ba-990d-2bf1c32800e3" (UID: "c2198ae3-47db-45ba-990d-2bf1c32800e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.647279 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "667c7f97-0625-43e8-bd1d-b248f07f231f" (UID: "667c7f97-0625-43e8-bd1d-b248f07f231f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.654107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx" (OuterVolumeSpecName: "kube-api-access-2dmnx") pod "667c7f97-0625-43e8-bd1d-b248f07f231f" (UID: "667c7f97-0625-43e8-bd1d-b248f07f231f"). InnerVolumeSpecName "kube-api-access-2dmnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.654255 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx" (OuterVolumeSpecName: "kube-api-access-gc8zx") pod "c2198ae3-47db-45ba-990d-2bf1c32800e3" (UID: "c2198ae3-47db-45ba-990d-2bf1c32800e3"). InnerVolumeSpecName "kube-api-access-gc8zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.747952 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc8zx\" (UniqueName: \"kubernetes.io/projected/c2198ae3-47db-45ba-990d-2bf1c32800e3-kube-api-access-gc8zx\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.747995 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2198ae3-47db-45ba-990d-2bf1c32800e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.748010 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/667c7f97-0625-43e8-bd1d-b248f07f231f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:40 crc kubenswrapper[4954]: I1206 08:55:40.748021 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dmnx\" (UniqueName: \"kubernetes.io/projected/667c7f97-0625-43e8-bd1d-b248f07f231f-kube-api-access-2dmnx\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.211077 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fa1f-account-create-update-2lftp" Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.211114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fa1f-account-create-update-2lftp" event={"ID":"667c7f97-0625-43e8-bd1d-b248f07f231f","Type":"ContainerDied","Data":"996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862"} Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.211195 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="996100c181775e33972ddad1f463714cba8f75f75f03268564121d392bd28862" Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.213843 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z8n9l" event={"ID":"c2198ae3-47db-45ba-990d-2bf1c32800e3","Type":"ContainerDied","Data":"2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3"} Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.213893 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z8n9l" Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.213899 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd003f9f14354f76025a3d943a41a9d77573d57928561417204a775a8c857f3" Dec 06 08:55:41 crc kubenswrapper[4954]: I1206 08:55:41.216693 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerStarted","Data":"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067"} Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.226393 4954 generic.go:334] "Generic (PLEG): container finished" podID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerID="e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067" exitCode=0 Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.226464 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerDied","Data":"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067"} Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.391930 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nntvl"] Dec 06 08:55:42 crc kubenswrapper[4954]: E1206 08:55:42.392292 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667c7f97-0625-43e8-bd1d-b248f07f231f" containerName="mariadb-account-create-update" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.392312 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="667c7f97-0625-43e8-bd1d-b248f07f231f" containerName="mariadb-account-create-update" Dec 06 08:55:42 crc kubenswrapper[4954]: E1206 08:55:42.392332 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2198ae3-47db-45ba-990d-2bf1c32800e3" containerName="mariadb-database-create" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.392338 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2198ae3-47db-45ba-990d-2bf1c32800e3" containerName="mariadb-database-create" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.392523 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="667c7f97-0625-43e8-bd1d-b248f07f231f" containerName="mariadb-account-create-update" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.392555 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2198ae3-47db-45ba-990d-2bf1c32800e3" containerName="mariadb-database-create" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.393115 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.396464 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.396658 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vgddh" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.396748 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.404340 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nntvl"] Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.477526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbf7w\" (UniqueName: \"kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.477610 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.477638 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.579830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbf7w\" (UniqueName: \"kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.579913 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.579952 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.585778 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.585877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.598283 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbf7w\" (UniqueName: \"kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w\") pod \"neutron-db-sync-nntvl\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:42 crc kubenswrapper[4954]: I1206 08:55:42.712328 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:43 crc kubenswrapper[4954]: I1206 08:55:43.169284 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nntvl"] Dec 06 08:55:43 crc kubenswrapper[4954]: I1206 08:55:43.237104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nntvl" event={"ID":"5bcff096-e20d-4ca1-9211-d48324b0fb6a","Type":"ContainerStarted","Data":"40340181dfac49f151a07bf92d0c4903fb9328cc3dd18088157336254a6c3d2f"} Dec 06 08:55:43 crc kubenswrapper[4954]: I1206 08:55:43.239790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerStarted","Data":"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286"} Dec 06 08:55:43 crc kubenswrapper[4954]: I1206 08:55:43.267590 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dsrn9" podStartSLOduration=2.846365676 podStartE2EDuration="5.267548686s" podCreationTimestamp="2025-12-06 08:55:38 +0000 UTC" firstStartedPulling="2025-12-06 08:55:40.200414374 +0000 UTC m=+7115.013773763" lastFinishedPulling="2025-12-06 08:55:42.621597394 +0000 UTC m=+7117.434956773" observedRunningTime="2025-12-06 08:55:43.258834445 +0000 UTC m=+7118.072193834" watchObservedRunningTime="2025-12-06 08:55:43.267548686 +0000 UTC m=+7118.080908085" Dec 06 08:55:44 crc kubenswrapper[4954]: I1206 08:55:44.249653 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nntvl" event={"ID":"5bcff096-e20d-4ca1-9211-d48324b0fb6a","Type":"ContainerStarted","Data":"fd04ce991f30226b3bb92f166e317cda14679d9d93acb2f43f6c14dd283e8e2e"} Dec 06 08:55:44 crc kubenswrapper[4954]: I1206 08:55:44.270259 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nntvl" podStartSLOduration=2.270238403 podStartE2EDuration="2.270238403s" podCreationTimestamp="2025-12-06 08:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:44.266584126 +0000 UTC m=+7119.079943525" watchObservedRunningTime="2025-12-06 08:55:44.270238403 +0000 UTC m=+7119.083597792" Dec 06 08:55:48 crc kubenswrapper[4954]: I1206 08:55:48.549071 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:48 crc kubenswrapper[4954]: I1206 08:55:48.549648 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:48 crc kubenswrapper[4954]: I1206 08:55:48.591868 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:49 crc kubenswrapper[4954]: I1206 08:55:49.316362 4954 generic.go:334] "Generic (PLEG): container finished" podID="5bcff096-e20d-4ca1-9211-d48324b0fb6a" containerID="fd04ce991f30226b3bb92f166e317cda14679d9d93acb2f43f6c14dd283e8e2e" exitCode=0 Dec 06 08:55:49 crc kubenswrapper[4954]: I1206 08:55:49.316820 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nntvl" event={"ID":"5bcff096-e20d-4ca1-9211-d48324b0fb6a","Type":"ContainerDied","Data":"fd04ce991f30226b3bb92f166e317cda14679d9d93acb2f43f6c14dd283e8e2e"} Dec 06 08:55:49 crc kubenswrapper[4954]: I1206 08:55:49.365424 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:49 crc kubenswrapper[4954]: I1206 08:55:49.418876 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.444206 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:55:50 crc kubenswrapper[4954]: E1206 08:55:50.444495 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.653639 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.829286 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle\") pod \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.829345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbf7w\" (UniqueName: \"kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w\") pod \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.829409 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config\") pod \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\" (UID: \"5bcff096-e20d-4ca1-9211-d48324b0fb6a\") " Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.861726 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w" (OuterVolumeSpecName: "kube-api-access-vbf7w") pod "5bcff096-e20d-4ca1-9211-d48324b0fb6a" (UID: "5bcff096-e20d-4ca1-9211-d48324b0fb6a"). InnerVolumeSpecName "kube-api-access-vbf7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.865702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config" (OuterVolumeSpecName: "config") pod "5bcff096-e20d-4ca1-9211-d48324b0fb6a" (UID: "5bcff096-e20d-4ca1-9211-d48324b0fb6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.865902 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bcff096-e20d-4ca1-9211-d48324b0fb6a" (UID: "5bcff096-e20d-4ca1-9211-d48324b0fb6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.931919 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.931959 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbf7w\" (UniqueName: \"kubernetes.io/projected/5bcff096-e20d-4ca1-9211-d48324b0fb6a-kube-api-access-vbf7w\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:50 crc kubenswrapper[4954]: I1206 08:55:50.931995 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bcff096-e20d-4ca1-9211-d48324b0fb6a-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.332499 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nntvl" event={"ID":"5bcff096-e20d-4ca1-9211-d48324b0fb6a","Type":"ContainerDied","Data":"40340181dfac49f151a07bf92d0c4903fb9328cc3dd18088157336254a6c3d2f"} Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.332553 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40340181dfac49f151a07bf92d0c4903fb9328cc3dd18088157336254a6c3d2f" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.332684 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dsrn9" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="registry-server" containerID="cri-o://0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286" gracePeriod=2 Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.332533 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nntvl" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.564806 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:55:51 crc kubenswrapper[4954]: E1206 08:55:51.565721 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcff096-e20d-4ca1-9211-d48324b0fb6a" containerName="neutron-db-sync" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.565743 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcff096-e20d-4ca1-9211-d48324b0fb6a" containerName="neutron-db-sync" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.565969 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcff096-e20d-4ca1-9211-d48324b0fb6a" containerName="neutron-db-sync" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.567205 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.577957 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.674765 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"] Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.676491 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.681257 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.681482 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.681657 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.681890 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vgddh" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.694981 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"] Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.745839 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.745884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.745954 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc 
kubenswrapper[4954]: I1206 08:55:51.746431 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.746522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhzl\" (UniqueName: \"kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848090 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848141 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhzl\" (UniqueName: \"kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848188 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848209 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848271 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848311 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 
08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848384 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848421 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.848451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkzl5\" (UniqueName: \"kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.849420 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.850491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.851157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.851366 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.868176 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhzl\" (UniqueName: \"kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl\") pod \"dnsmasq-dns-7d94c4b4dc-qtwjz\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.917087 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.949777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.949839 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkzl5\" (UniqueName: \"kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.949954 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.949980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.950009 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.954412 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.956173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.956883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.959042 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:51 crc kubenswrapper[4954]: I1206 08:55:51.972837 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wkzl5\" (UniqueName: \"kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5\") pod \"neutron-7bfd6458d4-54qqp\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") " pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.010413 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.140622 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.257310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities\") pod \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.257401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content\") pod \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.257451 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf45l\" (UniqueName: \"kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l\") pod \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\" (UID: \"752d1b6e-0838-4d77-bd81-0b85d7266e2a\") " Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.271359 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities" (OuterVolumeSpecName: "utilities") pod "752d1b6e-0838-4d77-bd81-0b85d7266e2a" (UID: "752d1b6e-0838-4d77-bd81-0b85d7266e2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.284751 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l" (OuterVolumeSpecName: "kube-api-access-mf45l") pod "752d1b6e-0838-4d77-bd81-0b85d7266e2a" (UID: "752d1b6e-0838-4d77-bd81-0b85d7266e2a"). InnerVolumeSpecName "kube-api-access-mf45l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.285134 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "752d1b6e-0838-4d77-bd81-0b85d7266e2a" (UID: "752d1b6e-0838-4d77-bd81-0b85d7266e2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.355938 4954 generic.go:334] "Generic (PLEG): container finished" podID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerID="0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286" exitCode=0 Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.356178 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsrn9" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.356705 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerDied","Data":"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286"} Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.356768 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsrn9" event={"ID":"752d1b6e-0838-4d77-bd81-0b85d7266e2a","Type":"ContainerDied","Data":"1f9839e027c7da5b371e88f487a177d19da54f9e50b4a645c7953e57ba69b3dc"} Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.356787 4954 scope.go:117] "RemoveContainer" containerID="0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.359627 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.359652 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752d1b6e-0838-4d77-bd81-0b85d7266e2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.359663 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf45l\" (UniqueName: \"kubernetes.io/projected/752d1b6e-0838-4d77-bd81-0b85d7266e2a-kube-api-access-mf45l\") on node \"crc\" DevicePath \"\"" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.391046 4954 scope.go:117] "RemoveContainer" containerID="e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.413752 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.418309 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsrn9"] Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.431454 4954 scope.go:117] "RemoveContainer" containerID="da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.465871 4954 scope.go:117] "RemoveContainer" containerID="0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286" Dec 06 08:55:52 crc kubenswrapper[4954]: E1206 08:55:52.466341 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286\": container with ID starting with 0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286 not found: ID does not exist" containerID="0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.466372 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286"} err="failed to get container status \"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286\": rpc error: code = NotFound desc = could not find container \"0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286\": container with ID starting with 
0dca0ba37a56bcf79842c1ba3a5161c95da7575e2e6ba262eef745155e0c1286 not found: ID does not exist" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.466393 4954 scope.go:117] "RemoveContainer" containerID="e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067" Dec 06 08:55:52 crc kubenswrapper[4954]: E1206 08:55:52.466812 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067\": container with ID starting with e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067 not found: ID does not exist" containerID="e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.466867 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067"} err="failed to get container status \"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067\": rpc error: code = NotFound desc = could not find container \"e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067\": container with ID starting with e534fe811d02b86b84d9f2739f89c9537e1febc4bcd92751fd1b3c6cff07f067 not found: ID does not exist" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.466903 4954 scope.go:117] "RemoveContainer" containerID="da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa" Dec 06 08:55:52 crc kubenswrapper[4954]: E1206 08:55:52.471923 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa\": container with ID starting with da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa not found: ID does not exist" containerID="da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.471990 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa"} err="failed to get container status \"da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa\": rpc error: code = NotFound desc = could not find container \"da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa\": container with ID starting with da0dd12d49e7bc1396b84886a351a86f98672e7a94a0cc17f0d781f0cae3ccfa not found: ID does not exist" Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.596227 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:55:52 crc kubenswrapper[4954]: I1206 08:55:52.896344 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"] Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.366901 4954 generic.go:334] "Generic (PLEG): container finished" podID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerID="b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31" exitCode=0 Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.366981 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" event={"ID":"93010cdb-946e-4a9b-8ca7-8531d352d881","Type":"ContainerDied","Data":"b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31"} Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.367012 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" event={"ID":"93010cdb-946e-4a9b-8ca7-8531d352d881","Type":"ContainerStarted","Data":"9204c89a5303b3a400d9c2e4efce15488b44d3140fd1cca008fca65b84e7b6ed"} Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.372788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerStarted","Data":"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"} Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.372829 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerStarted","Data":"357f4c9a7004eaf37948f2312c881dae3f2c91d94b9f49962598a8fcbf48f8c9"} Dec 06 08:55:53 crc kubenswrapper[4954]: I1206 08:55:53.456991 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" path="/var/lib/kubelet/pods/752d1b6e-0838-4d77-bd81-0b85d7266e2a/volumes" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.171816 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-686ff754fc-sms8t"] Dec 06 08:55:54 crc kubenswrapper[4954]: E1206 08:55:54.172576 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="extract-utilities" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.172598 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="extract-utilities" Dec 06 08:55:54 crc kubenswrapper[4954]: E1206 08:55:54.172621 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="registry-server" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.172629 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="registry-server" Dec 06 08:55:54 crc kubenswrapper[4954]: E1206 08:55:54.172655 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="extract-content" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.172664 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="extract-content" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.172842 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="752d1b6e-0838-4d77-bd81-0b85d7266e2a" containerName="registry-server" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.173819 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.177159 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 08:55:54 crc kubenswrapper[4954]: W1206 08:55:54.177358 4954 reflector.go:561] object-"openstack"/"cert-neutron-public-svc": failed to list *v1.Secret: secrets "cert-neutron-public-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 06 08:55:54 crc kubenswrapper[4954]: E1206 08:55:54.177408 4954 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-neutron-public-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-neutron-public-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.194879 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686ff754fc-sms8t"] Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.317732 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbb5x\" (UniqueName: \"kubernetes.io/projected/02ebab58-0e79-4851-9845-15e647a11e68-kube-api-access-zbb5x\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318229 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-internal-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318381 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-httpd-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318503 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-ovndb-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318651 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-combined-ca-bundle\") pod \"neutron-686ff754fc-sms8t\" (UID: 
\"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.318769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.384302 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerStarted","Data":"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"} Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.384737 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bfd6458d4-54qqp" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.386831 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" event={"ID":"93010cdb-946e-4a9b-8ca7-8531d352d881","Type":"ContainerStarted","Data":"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82"} Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.386984 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.405636 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bfd6458d4-54qqp" podStartSLOduration=3.405617058 podStartE2EDuration="3.405617058s" podCreationTimestamp="2025-12-06 08:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:54.402588058 +0000 UTC m=+7129.215947447" watchObservedRunningTime="2025-12-06 08:55:54.405617058 +0000 UTC m=+7129.218976447" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.420483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-httpd-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.420957 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-ovndb-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.421134 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-combined-ca-bundle\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.421337 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 
06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.421630 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.421853 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbb5x\" (UniqueName: \"kubernetes.io/projected/02ebab58-0e79-4851-9845-15e647a11e68-kube-api-access-zbb5x\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.422233 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-internal-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.424789 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" podStartSLOduration=3.424763198 podStartE2EDuration="3.424763198s" podCreationTimestamp="2025-12-06 08:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:54.420439703 +0000 UTC m=+7129.233799092" watchObservedRunningTime="2025-12-06 08:55:54.424763198 +0000 UTC m=+7129.238122587" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.426241 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-httpd-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.426509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-combined-ca-bundle\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.427221 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-config\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.428392 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-ovndb-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.439260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-internal-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " 
pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:54 crc kubenswrapper[4954]: I1206 08:55:54.440403 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbb5x\" (UniqueName: \"kubernetes.io/projected/02ebab58-0e79-4851-9845-15e647a11e68-kube-api-access-zbb5x\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:55 crc kubenswrapper[4954]: E1206 08:55:55.421871 4954 secret.go:188] Couldn't get secret openstack/cert-neutron-public-svc: failed to sync secret cache: timed out waiting for the condition Dec 06 08:55:55 crc kubenswrapper[4954]: E1206 08:55:55.422010 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs podName:02ebab58-0e79-4851-9845-15e647a11e68 nodeName:}" failed. No retries permitted until 2025-12-06 08:55:55.921957149 +0000 UTC m=+7130.735316538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs") pod "neutron-686ff754fc-sms8t" (UID: "02ebab58-0e79-4851-9845-15e647a11e68") : failed to sync secret cache: timed out waiting for the condition Dec 06 08:55:55 crc kubenswrapper[4954]: I1206 08:55:55.676037 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 08:55:55 crc kubenswrapper[4954]: I1206 08:55:55.952328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:55 crc kubenswrapper[4954]: I1206 08:55:55.963496 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebab58-0e79-4851-9845-15e647a11e68-public-tls-certs\") pod \"neutron-686ff754fc-sms8t\" (UID: \"02ebab58-0e79-4851-9845-15e647a11e68\") " pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:56 crc kubenswrapper[4954]: I1206 08:55:56.002144 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:56 crc kubenswrapper[4954]: I1206 08:55:56.532926 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686ff754fc-sms8t"] Dec 06 08:55:57 crc kubenswrapper[4954]: I1206 08:55:57.409241 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686ff754fc-sms8t" event={"ID":"02ebab58-0e79-4851-9845-15e647a11e68","Type":"ContainerStarted","Data":"878588d1a34869f01810b47ec8ce1bfbab3a7e3606261cd50468a1a51e4f08d7"} Dec 06 08:55:57 crc kubenswrapper[4954]: I1206 08:55:57.409699 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-686ff754fc-sms8t" Dec 06 08:55:57 crc kubenswrapper[4954]: I1206 08:55:57.409721 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686ff754fc-sms8t" event={"ID":"02ebab58-0e79-4851-9845-15e647a11e68","Type":"ContainerStarted","Data":"1399684b5e8f4018265f08843e6a6a3ca22e7ee5f3d244be5dd27e50707733ea"} Dec 06 08:55:57 crc kubenswrapper[4954]: I1206 08:55:57.409735 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686ff754fc-sms8t" event={"ID":"02ebab58-0e79-4851-9845-15e647a11e68","Type":"ContainerStarted","Data":"74b18e6b0279f07738b4c4d22c7c609f9b0a95143802beb5306379538d0c171c"} Dec 06 08:55:57 crc kubenswrapper[4954]: I1206 08:55:57.429083 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-686ff754fc-sms8t" podStartSLOduration=3.429057728 podStartE2EDuration="3.429057728s" podCreationTimestamp="2025-12-06 08:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:55:57.426926801 +0000 UTC m=+7132.240286210" watchObservedRunningTime="2025-12-06 08:55:57.429057728 +0000 UTC m=+7132.242417117" Dec 06 08:56:01 crc kubenswrapper[4954]: I1206 08:56:01.918825 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:56:01 crc kubenswrapper[4954]: I1206 08:56:01.979550 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"] Dec 06 08:56:01 crc kubenswrapper[4954]: I1206 08:56:01.979879 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="dnsmasq-dns" containerID="cri-o://903979689bf5a6b261e67f3a7d1d7c626c2b9bdd7a6c09d161614dd9163913b9" gracePeriod=10 Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.454992 4954 generic.go:334] "Generic (PLEG): container finished" podID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerID="903979689bf5a6b261e67f3a7d1d7c626c2b9bdd7a6c09d161614dd9163913b9" exitCode=0 Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.455079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" event={"ID":"18d52357-0b5c-4d38-abc9-28eef64b0987","Type":"ContainerDied","Data":"903979689bf5a6b261e67f3a7d1d7c626c2b9bdd7a6c09d161614dd9163913b9"} Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.455675 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78f54cd975-dcltx" event={"ID":"18d52357-0b5c-4d38-abc9-28eef64b0987","Type":"ContainerDied","Data":"313efe409e1bd0e95fa62ebc6f3c728b09263da6c3a0168fa19897fd03cb82d1"} Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.455783 4954 
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.537494 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.690883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb\") pod \"18d52357-0b5c-4d38-abc9-28eef64b0987\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") "
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.691009 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc\") pod \"18d52357-0b5c-4d38-abc9-28eef64b0987\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") "
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.691043 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb\") pod \"18d52357-0b5c-4d38-abc9-28eef64b0987\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") "
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.691067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config\") pod \"18d52357-0b5c-4d38-abc9-28eef64b0987\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") "
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.691102 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlksc\" (UniqueName: \"kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc\") pod \"18d52357-0b5c-4d38-abc9-28eef64b0987\" (UID: \"18d52357-0b5c-4d38-abc9-28eef64b0987\") "
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.698122 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc" (OuterVolumeSpecName: "kube-api-access-xlksc") pod "18d52357-0b5c-4d38-abc9-28eef64b0987" (UID: "18d52357-0b5c-4d38-abc9-28eef64b0987"). InnerVolumeSpecName "kube-api-access-xlksc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.737971 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18d52357-0b5c-4d38-abc9-28eef64b0987" (UID: "18d52357-0b5c-4d38-abc9-28eef64b0987"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.741618 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config" (OuterVolumeSpecName: "config") pod "18d52357-0b5c-4d38-abc9-28eef64b0987" (UID: "18d52357-0b5c-4d38-abc9-28eef64b0987"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.745514 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18d52357-0b5c-4d38-abc9-28eef64b0987" (UID: "18d52357-0b5c-4d38-abc9-28eef64b0987"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.778805 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18d52357-0b5c-4d38-abc9-28eef64b0987" (UID: "18d52357-0b5c-4d38-abc9-28eef64b0987"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.792776 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.792811 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.792825 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-config\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.792835 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlksc\" (UniqueName: \"kubernetes.io/projected/18d52357-0b5c-4d38-abc9-28eef64b0987-kube-api-access-xlksc\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:02 crc kubenswrapper[4954]: I1206 08:56:02.792845 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d52357-0b5c-4d38-abc9-28eef64b0987-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:03 crc kubenswrapper[4954]: I1206 08:56:03.461904 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78f54cd975-dcltx"
Dec 06 08:56:03 crc kubenswrapper[4954]: I1206 08:56:03.491244 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"]
Dec 06 08:56:03 crc kubenswrapper[4954]: I1206 08:56:03.499444 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78f54cd975-dcltx"]
Dec 06 08:56:05 crc kubenswrapper[4954]: I1206 08:56:05.449490 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:56:05 crc kubenswrapper[4954]: E1206 08:56:05.450127 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:56:05 crc kubenswrapper[4954]: I1206 08:56:05.455763 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" path="/var/lib/kubelet/pods/18d52357-0b5c-4d38-abc9-28eef64b0987/volumes"
Dec 06 08:56:19 crc kubenswrapper[4954]: I1206 08:56:19.443704 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:56:19 crc kubenswrapper[4954]: E1206 08:56:19.444640 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:56:22 crc kubenswrapper[4954]: I1206 08:56:22.019983 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bfd6458d4-54qqp"
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.013501 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-686ff754fc-sms8t"
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.070837 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"]
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.071070 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bfd6458d4-54qqp" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-api" containerID="cri-o://bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c" gracePeriod=30
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.071452 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bfd6458d4-54qqp" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-httpd" containerID="cri-o://3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086" gracePeriod=30
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.656772 4954 generic.go:334] "Generic (PLEG): container finished" podID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerID="3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086" exitCode=0
Dec 06 08:56:26 crc kubenswrapper[4954]: I1206 08:56:26.656851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerDied","Data":"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"}
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.105698 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfd6458d4-54qqp"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.193510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config\") pod \"264f0438-c16d-4dbc-9496-eba8c23251b4\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") "
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.193635 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle\") pod \"264f0438-c16d-4dbc-9496-eba8c23251b4\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") "
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.193708 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config\") pod \"264f0438-c16d-4dbc-9496-eba8c23251b4\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") "
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.193725 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs\") pod \"264f0438-c16d-4dbc-9496-eba8c23251b4\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") "
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.193803 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkzl5\" (UniqueName: \"kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5\") pod \"264f0438-c16d-4dbc-9496-eba8c23251b4\" (UID: \"264f0438-c16d-4dbc-9496-eba8c23251b4\") "
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.199680 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "264f0438-c16d-4dbc-9496-eba8c23251b4" (UID: "264f0438-c16d-4dbc-9496-eba8c23251b4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.199734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5" (OuterVolumeSpecName: "kube-api-access-wkzl5") pod "264f0438-c16d-4dbc-9496-eba8c23251b4" (UID: "264f0438-c16d-4dbc-9496-eba8c23251b4"). InnerVolumeSpecName "kube-api-access-wkzl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.242300 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "264f0438-c16d-4dbc-9496-eba8c23251b4" (UID: "264f0438-c16d-4dbc-9496-eba8c23251b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.255775 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config" (OuterVolumeSpecName: "config") pod "264f0438-c16d-4dbc-9496-eba8c23251b4" (UID: "264f0438-c16d-4dbc-9496-eba8c23251b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.268851 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "264f0438-c16d-4dbc-9496-eba8c23251b4" (UID: "264f0438-c16d-4dbc-9496-eba8c23251b4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.297866 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-config\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.297900 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.297913 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.297922 4954 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/264f0438-c16d-4dbc-9496-eba8c23251b4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.297931 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkzl5\" (UniqueName: \"kubernetes.io/projected/264f0438-c16d-4dbc-9496-eba8c23251b4-kube-api-access-wkzl5\") on node \"crc\" DevicePath \"\""
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.672840 4954 generic.go:334] "Generic (PLEG): container finished" podID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerID="bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c" exitCode=0
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.672955 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfd6458d4-54qqp"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.672951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerDied","Data":"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"}
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.673275 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfd6458d4-54qqp" event={"ID":"264f0438-c16d-4dbc-9496-eba8c23251b4","Type":"ContainerDied","Data":"357f4c9a7004eaf37948f2312c881dae3f2c91d94b9f49962598a8fcbf48f8c9"}
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.673298 4954 scope.go:117] "RemoveContainer" containerID="3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.715148 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"]
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.721914 4954 scope.go:117] "RemoveContainer" containerID="bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.726281 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bfd6458d4-54qqp"]
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.761617 4954 scope.go:117] "RemoveContainer" containerID="3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"
Dec 06 08:56:28 crc kubenswrapper[4954]: E1206 08:56:28.762080 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086\": container with ID starting with 3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086 not found: ID does not exist" containerID="3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.762204 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086"} err="failed to get container status \"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086\": rpc error: code = NotFound desc = could not find container \"3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086\": container with ID starting with 3aacc8c225ac774edc29b66ff382120dc980b12cc3a20c0326569d2710fb6086 not found: ID does not exist"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.762312 4954 scope.go:117] "RemoveContainer" containerID="bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"
Dec 06 08:56:28 crc kubenswrapper[4954]: E1206 08:56:28.762730 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c\": container with ID starting with bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c not found: ID does not exist" containerID="bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"
Dec 06 08:56:28 crc kubenswrapper[4954]: I1206 08:56:28.762815 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c"} err="failed to get container status \"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c\": rpc error: code = NotFound desc = could not find container \"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c\": container with ID starting with bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c not found: ID does not exist"
\"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c\": rpc error: code = NotFound desc = could not find container \"bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c\": container with ID starting with bece989f67c1243ec5edf447af0f67c5301c3c83b42cde9d53bafd652abcc16c not found: ID does not exist" Dec 06 08:56:29 crc kubenswrapper[4954]: I1206 08:56:29.453788 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" path="/var/lib/kubelet/pods/264f0438-c16d-4dbc-9496-eba8c23251b4/volumes" Dec 06 08:56:33 crc kubenswrapper[4954]: I1206 08:56:33.444132 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:56:33 crc kubenswrapper[4954]: E1206 08:56:33.444764 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:56:46 crc kubenswrapper[4954]: I1206 08:56:46.443844 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:56:46 crc kubenswrapper[4954]: E1206 08:56:46.444714 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.502470 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dw9vz"] Dec 06 08:56:53 crc kubenswrapper[4954]: E1206 08:56:53.504481 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="init" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.504640 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="init" Dec 06 08:56:53 crc kubenswrapper[4954]: E1206 08:56:53.504764 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-httpd" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.504848 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-httpd" Dec 06 08:56:53 crc kubenswrapper[4954]: E1206 08:56:53.504944 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-api" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.505019 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-api" Dec 06 08:56:53 crc kubenswrapper[4954]: E1206 08:56:53.505093 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="dnsmasq-dns" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.505262 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="dnsmasq-dns" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.505592 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d52357-0b5c-4d38-abc9-28eef64b0987" containerName="dnsmasq-dns" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.505707 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-httpd" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.505792 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="264f0438-c16d-4dbc-9496-eba8c23251b4" containerName="neutron-api" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.506634 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.509388 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.509522 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.509937 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.512808 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lmftf" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.514788 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.551081 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dw9vz"] Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.630250 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"] Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.642042 4954 util.go:30] "No sandbox for pod can be found. 
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654795 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbwjq\" (UniqueName: \"kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654832 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654867 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654936 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.654952 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.699216 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"]
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.765944 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbwjq\" (UniqueName: \"kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766173 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766267 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crst7\" (UniqueName: \"kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766293 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766336 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766460 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj"
Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz"
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.766545 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.767251 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.769157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.769465 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.775342 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.782776 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.786927 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.804533 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbwjq\" (UniqueName: \"kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq\") pod \"swift-ring-rebalance-dw9vz\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.859199 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.867928 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.868000 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crst7\" (UniqueName: \"kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.868050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.868113 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.868162 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.869148 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.869730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.873921 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.876502 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:53 crc kubenswrapper[4954]: I1206 08:56:53.895880 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crst7\" (UniqueName: \"kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7\") pod \"dnsmasq-dns-6866cd7fc7-ln7tj\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.002153 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.403068 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dw9vz"] Dec 06 08:56:54 crc kubenswrapper[4954]: W1206 08:56:54.413482 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a87fde_b771_4bc7_946a_a3e5a26f8992.slice/crio-080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61 WatchSource:0}: Error finding container 080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61: Status 404 returned error can't find the container with id 080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61 Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.584849 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"] Dec 06 08:56:54 crc kubenswrapper[4954]: W1206 08:56:54.596404 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f421aa_a1c3_4a3a_b305_b0571b0e3eb0.slice/crio-9b420856877844ada936d7f4af24fa80c9403ef2a3eab59682d3b4ad4b04156e WatchSource:0}: Error finding container 9b420856877844ada936d7f4af24fa80c9403ef2a3eab59682d3b4ad4b04156e: Status 404 returned error can't find the container with id 9b420856877844ada936d7f4af24fa80c9403ef2a3eab59682d3b4ad4b04156e Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.896454 4954 generic.go:334] "Generic (PLEG): container finished" podID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerID="7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e" exitCode=0 Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.896519 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" event={"ID":"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0","Type":"ContainerDied","Data":"7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e"} Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.896903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" event={"ID":"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0","Type":"ContainerStarted","Data":"9b420856877844ada936d7f4af24fa80c9403ef2a3eab59682d3b4ad4b04156e"} Dec 06 08:56:54 crc kubenswrapper[4954]: I1206 08:56:54.898033 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dw9vz" event={"ID":"b7a87fde-b771-4bc7-946a-a3e5a26f8992","Type":"ContainerStarted","Data":"080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61"} Dec 06 08:56:55 crc kubenswrapper[4954]: I1206 08:56:55.907033 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" event={"ID":"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0","Type":"ContainerStarted","Data":"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1"} Dec 06 08:56:55 crc kubenswrapper[4954]: I1206 08:56:55.907444 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:56:55 crc kubenswrapper[4954]: I1206 08:56:55.931416 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" podStartSLOduration=2.93139311 podStartE2EDuration="2.93139311s" podCreationTimestamp="2025-12-06 08:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:56:55.923073018 +0000 UTC m=+7190.736432417" watchObservedRunningTime="2025-12-06 08:56:55.93139311 +0000 UTC m=+7190.744752499" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.014226 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"] Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.022267 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.027502 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.047386 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"] Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115136 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115549 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115676 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115762 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115793 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.115815 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc68f\" (UniqueName: 
\"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217658 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217742 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217813 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217898 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217944 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.217973 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc68f\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.219118 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.219217 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.226491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift\") pod \"swift-proxy-6cb58c5648-fkjpb\" 
(UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.234175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.234274 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.237867 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc68f\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f\") pod \"swift-proxy-6cb58c5648-fkjpb\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") " pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:56 crc kubenswrapper[4954]: I1206 08:56:56.362580 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.804127 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d4b995744-2qch6"] Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.807874 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.810987 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.813633 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"] Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.813649 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.834502 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d4b995744-2qch6"] Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.892605 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-run-httpd\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.892730 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-combined-ca-bundle\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.892958 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-config-data\") pod \"swift-proxy-d4b995744-2qch6\" (UID: 
\"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.893051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czgvw\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-kube-api-access-czgvw\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.893117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-etc-swift\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.893322 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-internal-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.893434 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-log-httpd\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.893536 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-public-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.950136 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerStarted","Data":"e40b9c0bdd95c157b067c434c475f3143946ffc5ead27d00a790d51e7a0d5b55"} Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.951810 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dw9vz" event={"ID":"b7a87fde-b771-4bc7-946a-a3e5a26f8992","Type":"ContainerStarted","Data":"43cb550dbdcb844f46e4b11a4086dd9b61fa298d97c537b605a9f664c7fd0320"} Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.973301 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dw9vz" podStartSLOduration=2.141737713 podStartE2EDuration="5.97328116s" podCreationTimestamp="2025-12-06 08:56:53 +0000 UTC" firstStartedPulling="2025-12-06 08:56:54.417472726 +0000 UTC m=+7189.230832115" lastFinishedPulling="2025-12-06 08:56:58.249016173 +0000 UTC m=+7193.062375562" observedRunningTime="2025-12-06 08:56:58.971524673 +0000 UTC m=+7193.784884062" watchObservedRunningTime="2025-12-06 08:56:58.97328116 +0000 UTC m=+7193.786640539" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.994943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-config-data\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995364 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czgvw\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-kube-api-access-czgvw\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-etc-swift\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-internal-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-log-httpd\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-public-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-run-httpd\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.995682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-combined-ca-bundle\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.996258 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-log-httpd\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:58 crc kubenswrapper[4954]: I1206 08:56:58.996313 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-run-httpd\") pod \"swift-proxy-d4b995744-2qch6\" 
(UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.000840 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-combined-ca-bundle\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.001286 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-etc-swift\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.001648 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-config-data\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.002788 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-internal-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.009762 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-public-tls-certs\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.014372 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czgvw\" (UniqueName: \"kubernetes.io/projected/4f6b671e-20ef-4d02-aedd-a5d46bc23b40-kube-api-access-czgvw\") pod \"swift-proxy-d4b995744-2qch6\" (UID: \"4f6b671e-20ef-4d02-aedd-a5d46bc23b40\") " pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.194977 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.963899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerStarted","Data":"fcf6745e574b5a199834eacc33aa9947b6938c624c995c3b418fd9e27508418a"} Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.964210 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.964223 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerStarted","Data":"c638259749bc722004567d66d4e50a88056f3410e9e80f6d1b6c7c8f081c7c59"} Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.964234 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:56:59 crc kubenswrapper[4954]: I1206 08:56:59.989765 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6cb58c5648-fkjpb" podStartSLOduration=4.989744433 podStartE2EDuration="4.989744433s" podCreationTimestamp="2025-12-06 08:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:56:59.986363743 +0000 UTC m=+7194.799723142" watchObservedRunningTime="2025-12-06 08:56:59.989744433 +0000 UTC m=+7194.803103822" Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.037386 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d4b995744-2qch6"] Dec 06 08:57:00 crc kubenswrapper[4954]: W1206 08:57:00.047021 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6b671e_20ef_4d02_aedd_a5d46bc23b40.slice/crio-f011d712dca63a1303e97fc45fde1b14ec4385d678432aeac3b8e2517201dd4f WatchSource:0}: Error finding container f011d712dca63a1303e97fc45fde1b14ec4385d678432aeac3b8e2517201dd4f: Status 404 returned error can't find the container with id f011d712dca63a1303e97fc45fde1b14ec4385d678432aeac3b8e2517201dd4f Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.976181 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b995744-2qch6" event={"ID":"4f6b671e-20ef-4d02-aedd-a5d46bc23b40","Type":"ContainerStarted","Data":"e3ba9961c988fa4d22dc75820b4caf06a554cce454777568a4a87fa50057a0e1"} Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.976688 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b995744-2qch6" event={"ID":"4f6b671e-20ef-4d02-aedd-a5d46bc23b40","Type":"ContainerStarted","Data":"72bd1ff87ea6aaa36d66a3e6a1a5331f57fdf0b1fbf7d31b74e2d6519cebea49"} Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.976704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b995744-2qch6" event={"ID":"4f6b671e-20ef-4d02-aedd-a5d46bc23b40","Type":"ContainerStarted","Data":"f011d712dca63a1303e97fc45fde1b14ec4385d678432aeac3b8e2517201dd4f"} Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.976743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:57:00 crc kubenswrapper[4954]: I1206 08:57:00.976764 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:57:01 crc kubenswrapper[4954]: I1206 08:57:01.013317 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d4b995744-2qch6" podStartSLOduration=3.013286054 podStartE2EDuration="3.013286054s" podCreationTimestamp="2025-12-06 08:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:57:00.99846752 +0000 UTC m=+7195.811826919" watchObservedRunningTime="2025-12-06 08:57:01.013286054 +0000 UTC m=+7195.826645453" Dec 06 08:57:01 crc kubenswrapper[4954]: I1206 08:57:01.444020 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:57:01 crc kubenswrapper[4954]: E1206 08:57:01.444722 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:57:02 crc kubenswrapper[4954]: I1206 08:57:02.992189 4954 generic.go:334] "Generic (PLEG): container finished" podID="b7a87fde-b771-4bc7-946a-a3e5a26f8992" containerID="43cb550dbdcb844f46e4b11a4086dd9b61fa298d97c537b605a9f664c7fd0320" exitCode=0 Dec 06 08:57:02 crc kubenswrapper[4954]: I1206 08:57:02.992273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dw9vz" event={"ID":"b7a87fde-b771-4bc7-946a-a3e5a26f8992","Type":"ContainerDied","Data":"43cb550dbdcb844f46e4b11a4086dd9b61fa298d97c537b605a9f664c7fd0320"} Dec 06 08:57:04 crc kubenswrapper[4954]: I1206 08:57:04.004040 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:57:04 crc kubenswrapper[4954]: I1206 08:57:04.073416 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:57:04 crc kubenswrapper[4954]: I1206 08:57:04.073958 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="dnsmasq-dns" containerID="cri-o://7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82" gracePeriod=10 Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.522988 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.619353 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.619849 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbwjq\" (UniqueName: \"kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.619900 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.619942 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.619969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.620068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.620107 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift\") pod \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\" (UID: \"b7a87fde-b771-4bc7-946a-a3e5a26f8992\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.620182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.620456 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.622077 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.627168 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq" (OuterVolumeSpecName: "kube-api-access-bbwjq") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "kube-api-access-bbwjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.632472 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.652113 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts" (OuterVolumeSpecName: "scripts") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.655355 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.664967 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7a87fde-b771-4bc7-946a-a3e5a26f8992" (UID: "b7a87fde-b771-4bc7-946a-a3e5a26f8992"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722408 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722439 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722449 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a87fde-b771-4bc7-946a-a3e5a26f8992-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722459 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a87fde-b771-4bc7-946a-a3e5a26f8992-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722468 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbwjq\" (UniqueName: \"kubernetes.io/projected/b7a87fde-b771-4bc7-946a-a3e5a26f8992-kube-api-access-bbwjq\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.722477 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a87fde-b771-4bc7-946a-a3e5a26f8992-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.731664 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.825473 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrhzl\" (UniqueName: \"kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl\") pod \"93010cdb-946e-4a9b-8ca7-8531d352d881\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.825538 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb\") pod \"93010cdb-946e-4a9b-8ca7-8531d352d881\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.825722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config\") pod \"93010cdb-946e-4a9b-8ca7-8531d352d881\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.825805 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc\") pod \"93010cdb-946e-4a9b-8ca7-8531d352d881\" (UID: \"93010cdb-946e-4a9b-8ca7-8531d352d881\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.825828 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb\") pod \"93010cdb-946e-4a9b-8ca7-8531d352d881\" (UID: 
\"93010cdb-946e-4a9b-8ca7-8531d352d881\") " Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.828887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl" (OuterVolumeSpecName: "kube-api-access-nrhzl") pod "93010cdb-946e-4a9b-8ca7-8531d352d881" (UID: "93010cdb-946e-4a9b-8ca7-8531d352d881"). InnerVolumeSpecName "kube-api-access-nrhzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.867516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93010cdb-946e-4a9b-8ca7-8531d352d881" (UID: "93010cdb-946e-4a9b-8ca7-8531d352d881"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.870716 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config" (OuterVolumeSpecName: "config") pod "93010cdb-946e-4a9b-8ca7-8531d352d881" (UID: "93010cdb-946e-4a9b-8ca7-8531d352d881"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.873995 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93010cdb-946e-4a9b-8ca7-8531d352d881" (UID: "93010cdb-946e-4a9b-8ca7-8531d352d881"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.877845 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93010cdb-946e-4a9b-8ca7-8531d352d881" (UID: "93010cdb-946e-4a9b-8ca7-8531d352d881"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.927826 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.927862 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.927877 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrhzl\" (UniqueName: \"kubernetes.io/projected/93010cdb-946e-4a9b-8ca7-8531d352d881-kube-api-access-nrhzl\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.927888 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:04.927897 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93010cdb-946e-4a9b-8ca7-8531d352d881-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.036892 4954 generic.go:334] "Generic (PLEG): container finished" podID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerID="7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82" exitCode=0 Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.036965 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.036964 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" event={"ID":"93010cdb-946e-4a9b-8ca7-8531d352d881","Type":"ContainerDied","Data":"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82"} Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.036998 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94c4b4dc-qtwjz" event={"ID":"93010cdb-946e-4a9b-8ca7-8531d352d881","Type":"ContainerDied","Data":"9204c89a5303b3a400d9c2e4efce15488b44d3140fd1cca008fca65b84e7b6ed"} Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.037398 4954 scope.go:117] "RemoveContainer" containerID="7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.043594 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dw9vz" event={"ID":"b7a87fde-b771-4bc7-946a-a3e5a26f8992","Type":"ContainerDied","Data":"080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61"} Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.043653 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="080c306b7c489a9809c9200ec91e3f9396d2400e54aefc40866d0e22e8fada61" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.043788 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dw9vz" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.063803 4954 scope.go:117] "RemoveContainer" containerID="b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.086233 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.094844 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d94c4b4dc-qtwjz"] Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.105545 4954 scope.go:117] "RemoveContainer" containerID="7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82" Dec 06 08:57:05 crc kubenswrapper[4954]: E1206 08:57:05.106110 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82\": container with ID starting with 7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82 not found: ID does not exist" containerID="7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.106162 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82"} err="failed to get container status \"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82\": rpc error: code = NotFound desc = could not find container \"7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82\": container with ID starting with 7d37d4181228a99335367b7cd1121c4343ecf649173f24f7151ab12635174d82 not found: ID does not exist" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.106196 4954 scope.go:117] "RemoveContainer" containerID="b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31" Dec 06 08:57:05 crc kubenswrapper[4954]: E1206 08:57:05.106642 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31\": container with ID starting with b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31 not found: ID does not exist" containerID="b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.106667 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31"} err="failed to get container status \"b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31\": rpc error: code = NotFound desc = could not find container \"b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31\": container with ID starting with b5a9f69008f9e3ebbfe873dfd711064490052249cfaf662e9996cf36a2518e31 not found: ID does not exist" Dec 06 08:57:05 crc kubenswrapper[4954]: I1206 08:57:05.452540 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" path="/var/lib/kubelet/pods/93010cdb-946e-4a9b-8ca7-8531d352d881/volumes" Dec 06 08:57:06 crc kubenswrapper[4954]: I1206 08:57:06.367939 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:57:06 crc kubenswrapper[4954]: I1206 08:57:06.369220 4954 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6cb58c5648-fkjpb" Dec 06 08:57:09 crc kubenswrapper[4954]: I1206 08:57:09.201240 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:57:09 crc kubenswrapper[4954]: I1206 08:57:09.201786 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d4b995744-2qch6" Dec 06 08:57:09 crc kubenswrapper[4954]: I1206 08:57:09.286422 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"] Dec 06 08:57:09 crc kubenswrapper[4954]: I1206 08:57:09.286664 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6cb58c5648-fkjpb" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-httpd" containerID="cri-o://c638259749bc722004567d66d4e50a88056f3410e9e80f6d1b6c7c8f081c7c59" gracePeriod=30 Dec 06 08:57:09 crc kubenswrapper[4954]: I1206 08:57:09.286797 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6cb58c5648-fkjpb" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-server" containerID="cri-o://fcf6745e574b5a199834eacc33aa9947b6938c624c995c3b418fd9e27508418a" gracePeriod=30 Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.090960 4954 generic.go:334] "Generic (PLEG): container finished" podID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerID="fcf6745e574b5a199834eacc33aa9947b6938c624c995c3b418fd9e27508418a" exitCode=0 Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.091280 4954 generic.go:334] "Generic (PLEG): container finished" podID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerID="c638259749bc722004567d66d4e50a88056f3410e9e80f6d1b6c7c8f081c7c59" exitCode=0 Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.091056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerDied","Data":"fcf6745e574b5a199834eacc33aa9947b6938c624c995c3b418fd9e27508418a"} Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.091328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerDied","Data":"c638259749bc722004567d66d4e50a88056f3410e9e80f6d1b6c7c8f081c7c59"} Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.434953 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6cb58c5648-fkjpb"
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.543662 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.543996 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.544090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.544172 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.544309 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc68f\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.544418 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd\") pod \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\" (UID: \"a9103ab8-0c1e-4734-89d0-ed01b5bd366a\") "
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.545587 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.546068 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.549451 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.549613 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f" (OuterVolumeSpecName: "kube-api-access-rc68f") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "kube-api-access-rc68f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.599314 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data" (OuterVolumeSpecName: "config-data") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.606174 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9103ab8-0c1e-4734-89d0-ed01b5bd366a" (UID: "a9103ab8-0c1e-4734-89d0-ed01b5bd366a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.646959 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.647003 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.647015 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.647029 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc68f\" (UniqueName: \"kubernetes.io/projected/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-kube-api-access-rc68f\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.647042 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:10 crc kubenswrapper[4954]: I1206 08:57:10.647054 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9103ab8-0c1e-4734-89d0-ed01b5bd366a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.103304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6cb58c5648-fkjpb" event={"ID":"a9103ab8-0c1e-4734-89d0-ed01b5bd366a","Type":"ContainerDied","Data":"e40b9c0bdd95c157b067c434c475f3143946ffc5ead27d00a790d51e7a0d5b55"}
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.103608 4954 scope.go:117] "RemoveContainer" containerID="fcf6745e574b5a199834eacc33aa9947b6938c624c995c3b418fd9e27508418a"
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.103398 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6cb58c5648-fkjpb"
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.136678 4954 scope.go:117] "RemoveContainer" containerID="c638259749bc722004567d66d4e50a88056f3410e9e80f6d1b6c7c8f081c7c59"
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.149530 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"]
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.156954 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6cb58c5648-fkjpb"]
Dec 06 08:57:11 crc kubenswrapper[4954]: I1206 08:57:11.463169 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" path="/var/lib/kubelet/pods/a9103ab8-0c1e-4734-89d0-ed01b5bd366a/volumes"
Dec 06 08:57:13 crc kubenswrapper[4954]: I1206 08:57:13.444363 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:57:13 crc kubenswrapper[4954]: E1206 08:57:13.444634 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:57:23 crc kubenswrapper[4954]: I1206 08:57:23.164496 4954 scope.go:117] "RemoveContainer" containerID="a36d8863c1f2fcdf0df44c66eaf9bfa80d788088a0035b5104e00363b99dfba4"
Dec 06 08:57:27 crc kubenswrapper[4954]: I1206 08:57:27.443537 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:57:27 crc kubenswrapper[4954]: E1206 08:57:27.444176 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:57:38 crc kubenswrapper[4954]: I1206 08:57:38.443513 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:57:38 crc kubenswrapper[4954]: E1206 08:57:38.444259 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.753739 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-z8js6"]
Dec 06 08:57:41 crc kubenswrapper[4954]: E1206 08:57:41.754612 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="init"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754629 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="init"
Dec 06 08:57:41 crc kubenswrapper[4954]: E1206 08:57:41.754663 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="dnsmasq-dns"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754671 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="dnsmasq-dns"
Dec 06 08:57:41 crc kubenswrapper[4954]: E1206 08:57:41.754690 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a87fde-b771-4bc7-946a-a3e5a26f8992" containerName="swift-ring-rebalance"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754699 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a87fde-b771-4bc7-946a-a3e5a26f8992" containerName="swift-ring-rebalance"
Dec 06 08:57:41 crc kubenswrapper[4954]: E1206 08:57:41.754714 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-server"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754725 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-server"
Dec 06 08:57:41 crc kubenswrapper[4954]: E1206 08:57:41.754747 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-httpd"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754755 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-httpd"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.754982 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-httpd"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.755000 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a87fde-b771-4bc7-946a-a3e5a26f8992" containerName="swift-ring-rebalance"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.755011 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="93010cdb-946e-4a9b-8ca7-8531d352d881" containerName="dnsmasq-dns"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.755022 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9103ab8-0c1e-4734-89d0-ed01b5bd366a" containerName="proxy-server"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.755762 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.769668 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z8js6"]
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.851776 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a3d8-account-create-update-6p8sr"]
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.853005 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.856410 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.867080 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a3d8-account-create-update-6p8sr"]
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.927077 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.927519 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss77g\" (UniqueName: \"kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.927668 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t28wv\" (UniqueName: \"kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:41 crc kubenswrapper[4954]: I1206 08:57:41.927852 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.031098 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.031341 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.031594 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss77g\" (UniqueName: \"kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.031663 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t28wv\" (UniqueName: \"kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.032784 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.033362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.054550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss77g\" (UniqueName: \"kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g\") pod \"cinder-a3d8-account-create-update-6p8sr\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") " pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.055451 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t28wv\" (UniqueName: \"kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv\") pod \"cinder-db-create-z8js6\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") " pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.082843 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.208172 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.532914 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z8js6"]
Dec 06 08:57:42 crc kubenswrapper[4954]: I1206 08:57:42.668127 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a3d8-account-create-update-6p8sr"]
Dec 06 08:57:42 crc kubenswrapper[4954]: W1206 08:57:42.674522 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadc0a40a_e991_4416_962b_fad304dc412c.slice/crio-5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743 WatchSource:0}: Error finding container 5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743: Status 404 returned error can't find the container with id 5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.408143 4954 generic.go:334] "Generic (PLEG): container finished" podID="adc0a40a-e991-4416-962b-fad304dc412c" containerID="d85be8ac5449c87f4e32dcd903dd026b7361d3d619a7b137a1ce711706cfb8c0" exitCode=0
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.408192 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a3d8-account-create-update-6p8sr" event={"ID":"adc0a40a-e991-4416-962b-fad304dc412c","Type":"ContainerDied","Data":"d85be8ac5449c87f4e32dcd903dd026b7361d3d619a7b137a1ce711706cfb8c0"}
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.408235 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a3d8-account-create-update-6p8sr" event={"ID":"adc0a40a-e991-4416-962b-fad304dc412c","Type":"ContainerStarted","Data":"5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743"}
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.410183 4954 generic.go:334] "Generic (PLEG): container finished" podID="ded01376-eebe-4038-99eb-bd70cbc5e61f" containerID="c8e7b319ad5c939409ebead664b35c2411b8da565fd7e7e30c6a787d5b100c45" exitCode=0
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.410232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z8js6" event={"ID":"ded01376-eebe-4038-99eb-bd70cbc5e61f","Type":"ContainerDied","Data":"c8e7b319ad5c939409ebead664b35c2411b8da565fd7e7e30c6a787d5b100c45"}
Dec 06 08:57:43 crc kubenswrapper[4954]: I1206 08:57:43.410267 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z8js6" event={"ID":"ded01376-eebe-4038-99eb-bd70cbc5e61f","Type":"ContainerStarted","Data":"bcf793de515a870237161d4f83a7eed05d1934b23d8d5304e4fdc0269b805067"}
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.829459 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.835491 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.986367 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts\") pod \"adc0a40a-e991-4416-962b-fad304dc412c\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") "
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.986580 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t28wv\" (UniqueName: \"kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv\") pod \"ded01376-eebe-4038-99eb-bd70cbc5e61f\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") "
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.986650 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss77g\" (UniqueName: \"kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g\") pod \"adc0a40a-e991-4416-962b-fad304dc412c\" (UID: \"adc0a40a-e991-4416-962b-fad304dc412c\") "
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.986790 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts\") pod \"ded01376-eebe-4038-99eb-bd70cbc5e61f\" (UID: \"ded01376-eebe-4038-99eb-bd70cbc5e61f\") "
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.987306 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adc0a40a-e991-4416-962b-fad304dc412c" (UID: "adc0a40a-e991-4416-962b-fad304dc412c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.987658 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ded01376-eebe-4038-99eb-bd70cbc5e61f" (UID: "ded01376-eebe-4038-99eb-bd70cbc5e61f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.993126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv" (OuterVolumeSpecName: "kube-api-access-t28wv") pod "ded01376-eebe-4038-99eb-bd70cbc5e61f" (UID: "ded01376-eebe-4038-99eb-bd70cbc5e61f"). InnerVolumeSpecName "kube-api-access-t28wv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:57:44 crc kubenswrapper[4954]: I1206 08:57:44.993335 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g" (OuterVolumeSpecName: "kube-api-access-ss77g") pod "adc0a40a-e991-4416-962b-fad304dc412c" (UID: "adc0a40a-e991-4416-962b-fad304dc412c"). InnerVolumeSpecName "kube-api-access-ss77g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.089582 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss77g\" (UniqueName: \"kubernetes.io/projected/adc0a40a-e991-4416-962b-fad304dc412c-kube-api-access-ss77g\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.089629 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded01376-eebe-4038-99eb-bd70cbc5e61f-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.089643 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc0a40a-e991-4416-962b-fad304dc412c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.089653 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t28wv\" (UniqueName: \"kubernetes.io/projected/ded01376-eebe-4038-99eb-bd70cbc5e61f-kube-api-access-t28wv\") on node \"crc\" DevicePath \"\""
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.426727 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a3d8-account-create-update-6p8sr" event={"ID":"adc0a40a-e991-4416-962b-fad304dc412c","Type":"ContainerDied","Data":"5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743"}
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.426814 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c9f98706b92d795b3552bc93f54a3367aadd22a2f8d8d13316ebae457afd743"
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.426776 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a3d8-account-create-update-6p8sr"
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.428873 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z8js6" event={"ID":"ded01376-eebe-4038-99eb-bd70cbc5e61f","Type":"ContainerDied","Data":"bcf793de515a870237161d4f83a7eed05d1934b23d8d5304e4fdc0269b805067"}
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.428912 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf793de515a870237161d4f83a7eed05d1934b23d8d5304e4fdc0269b805067"
Dec 06 08:57:45 crc kubenswrapper[4954]: I1206 08:57:45.428943 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z8js6"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.142598 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bpcjv"]
Dec 06 08:57:47 crc kubenswrapper[4954]: E1206 08:57:47.143603 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc0a40a-e991-4416-962b-fad304dc412c" containerName="mariadb-account-create-update"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.143628 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc0a40a-e991-4416-962b-fad304dc412c" containerName="mariadb-account-create-update"
Dec 06 08:57:47 crc kubenswrapper[4954]: E1206 08:57:47.143675 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded01376-eebe-4038-99eb-bd70cbc5e61f" containerName="mariadb-database-create"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.143685 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded01376-eebe-4038-99eb-bd70cbc5e61f" containerName="mariadb-database-create"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.143917 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc0a40a-e991-4416-962b-fad304dc412c" containerName="mariadb-account-create-update"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.143938 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded01376-eebe-4038-99eb-bd70cbc5e61f" containerName="mariadb-database-create"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.144799 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.150346 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.151969 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.152237 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qcdlj"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.152363 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bpcjv"]
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.278769 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.278816 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.278842 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.278874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.278927 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.279096 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxj7c\" (UniqueName: \"kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.382477 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxj7c\" (UniqueName: \"kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.382901 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.382940 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.382970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.383018 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.383092 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.384770 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.389135 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.400150 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.400310 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.400785 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.414425 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxj7c\" (UniqueName: \"kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c\") pod \"cinder-db-sync-bpcjv\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") " pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.475223 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.964033 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bpcjv"]
Dec 06 08:57:47 crc kubenswrapper[4954]: W1206 08:57:47.969435 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90c1a08_b5fa_49a4_b8b3_f26e610d5cc5.slice/crio-4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6 WatchSource:0}: Error finding container 4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6: Status 404 returned error can't find the container with id 4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6
Dec 06 08:57:47 crc kubenswrapper[4954]: I1206 08:57:47.972397 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 08:57:48 crc kubenswrapper[4954]: I1206 08:57:48.458925 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bpcjv" event={"ID":"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5","Type":"ContainerStarted","Data":"4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6"}
Dec 06 08:57:53 crc kubenswrapper[4954]: I1206 08:57:53.443991 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:57:53 crc kubenswrapper[4954]: E1206 08:57:53.445286 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:58:04 crc kubenswrapper[4954]: I1206 08:58:04.443095 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:58:04 crc kubenswrapper[4954]: E1206 08:58:04.444073 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:58:07 crc kubenswrapper[4954]: E1206 08:58:07.237476 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3923531bcda0b0811b2d5053f189beb"
Dec 06 08:58:07 crc kubenswrapper[4954]: E1206 08:58:07.237923 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3923531bcda0b0811b2d5053f189beb"
Dec 06 08:58:07 crc kubenswrapper[4954]: E1206 08:58:07.238096 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3923531bcda0b0811b2d5053f189beb,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxj7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bpcjv_openstack(d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 06 08:58:07 crc kubenswrapper[4954]: E1206 08:58:07.239333 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bpcjv" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"
Dec 06 08:58:07 crc kubenswrapper[4954]: E1206 08:58:07.678978 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3923531bcda0b0811b2d5053f189beb\\\"\"" pod="openstack/cinder-db-sync-bpcjv" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"
Dec 06 08:58:16 crc kubenswrapper[4954]: I1206 08:58:16.444529 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8"
Dec 06 08:58:16 crc kubenswrapper[4954]: E1206 08:58:16.446611 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 08:58:23 crc kubenswrapper[4954]: I1206 08:58:23.247814 4954 scope.go:117] "RemoveContainer" containerID="51a6d5b8a82fba03786299e422806156058bd23f79d612d7af562b8a848ce1e3"
Dec 06 08:58:23 crc kubenswrapper[4954]: I1206 08:58:23.277347 4954 scope.go:117] "RemoveContainer" containerID="a3abc95c23bdb4f2bcf32f418ea864da9eda7ba7590a3d4ab03cc259abf79669"
Dec 06 08:58:23 crc kubenswrapper[4954]: I1206 08:58:23.812519 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bpcjv" event={"ID":"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5","Type":"ContainerStarted","Data":"4d426f7347144e8477495f01de0100abebba3f45d74ef577632ce7dd287936b0"}
Dec 06 08:58:23 crc kubenswrapper[4954]: I1206 08:58:23.848306 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bpcjv" podStartSLOduration=2.135223244 podStartE2EDuration="36.848264949s" podCreationTimestamp="2025-12-06 08:57:47 +0000 UTC" firstStartedPulling="2025-12-06 08:57:47.972115928 +0000 UTC m=+7242.785475317" lastFinishedPulling="2025-12-06 08:58:22.685157633 +0000 UTC m=+7277.498517022" observedRunningTime="2025-12-06 08:58:23.832252873 +0000 UTC m=+7278.645612272" watchObservedRunningTime="2025-12-06 08:58:23.848264949 +0000 UTC m=+7278.661624358"
Dec 06 08:58:25 crc kubenswrapper[4954]: I1206 08:58:25.830157 4954 generic.go:334] "Generic (PLEG): container finished" podID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" containerID="4d426f7347144e8477495f01de0100abebba3f45d74ef577632ce7dd287936b0" exitCode=0
Dec 06 08:58:25 crc kubenswrapper[4954]: I1206 08:58:25.830262 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bpcjv" event={"ID":"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5","Type":"ContainerDied","Data":"4d426f7347144e8477495f01de0100abebba3f45d74ef577632ce7dd287936b0"}
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.151432 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.246779 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxj7c\" (UniqueName: \"kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.246860 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.246909 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.247037 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.247057 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.247185 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.247242 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data\") pod \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\" (UID: \"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5\") "
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.248309 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.252898 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts" (OuterVolumeSpecName: "scripts") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.253285 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.253530 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c" (OuterVolumeSpecName: "kube-api-access-bxj7c") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "kube-api-access-bxj7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.271611 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.293266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data" (OuterVolumeSpecName: "config-data") pod "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" (UID: "d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.350037 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.350072 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.350083 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxj7c\" (UniqueName: \"kubernetes.io/projected/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-kube-api-access-bxj7c\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.350094 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.350104 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.853126 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bpcjv" event={"ID":"d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5","Type":"ContainerDied","Data":"4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6"}
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.853178 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4277b835b614ac436834484461ea4e009ea3b944ae5a3df3a48b77264c8a4aa6"
Dec 06 08:58:27 crc kubenswrapper[4954]: I1206 08:58:27.853249 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bpcjv"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.546263 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"]
Dec 06 08:58:28 crc kubenswrapper[4954]: E1206 08:58:28.547237 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" containerName="cinder-db-sync"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.547254 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" containerName="cinder-db-sync"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.547955 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" containerName="cinder-db-sync"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.551033 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.564104 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"]
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.671013 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx8bm\" (UniqueName: \"kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.671124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.671150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.671213 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.671230 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.773348 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx8bm\" (UniqueName: \"kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.773480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.773508 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.773603 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.773627 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.774732 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.775658 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.776507 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.777212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.791185 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.793180 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.803426 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.803748 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.804090 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qcdlj"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.804542 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx8bm\" (UniqueName: \"kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm\") pod \"dnsmasq-dns-7cf6448dcf-qplt7\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.817093 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.835453 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.878144 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.879951 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.879986 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.880052 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.880079 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.880107 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.880192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.880224 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj57f\" (UniqueName: \"kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.981741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj57f\" (UniqueName: \"kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982145 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982168 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982224 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.982428 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.983121 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.987474 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.993194 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.993285 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:28 crc kubenswrapper[4954]: I1206 08:58:28.994109 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.008054 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj57f\" (UniqueName: \"kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f\") pod \"cinder-api-0\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " pod="openstack/cinder-api-0"
Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.170779 4954 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.412689 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"] Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.623260 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:29 crc kubenswrapper[4954]: W1206 08:58:29.628214 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16605dfd_45af_4f10_bd7a_09f9b7122140.slice/crio-7371550fc87998a80dde8e0997e1d99c6b9764961c19d6ac76acf88fa9ee95ca WatchSource:0}: Error finding container 7371550fc87998a80dde8e0997e1d99c6b9764961c19d6ac76acf88fa9ee95ca: Status 404 returned error can't find the container with id 7371550fc87998a80dde8e0997e1d99c6b9764961c19d6ac76acf88fa9ee95ca Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.876606 4954 generic.go:334] "Generic (PLEG): container finished" podID="1aca932f-9701-482e-8973-81dc83779cb1" containerID="6352fa3d04d90cd44624806bb282f6f06ca4637543e09ca43ce788a1b30b167c" exitCode=0 Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.876695 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" event={"ID":"1aca932f-9701-482e-8973-81dc83779cb1","Type":"ContainerDied","Data":"6352fa3d04d90cd44624806bb282f6f06ca4637543e09ca43ce788a1b30b167c"} Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.876967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" event={"ID":"1aca932f-9701-482e-8973-81dc83779cb1","Type":"ContainerStarted","Data":"6dab85d51c1aa1fb222fb8d119da5a5dbc374a883d1a1b00705456df34e9c235"} Dec 06 08:58:29 crc kubenswrapper[4954]: I1206 08:58:29.878191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerStarted","Data":"7371550fc87998a80dde8e0997e1d99c6b9764961c19d6ac76acf88fa9ee95ca"} Dec 06 08:58:30 crc kubenswrapper[4954]: I1206 08:58:30.444430 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:58:30 crc kubenswrapper[4954]: E1206 08:58:30.444639 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 08:58:30 crc kubenswrapper[4954]: I1206 08:58:30.894730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" event={"ID":"1aca932f-9701-482e-8973-81dc83779cb1","Type":"ContainerStarted","Data":"821307b669af19f49db134ae5e408c5a20edf17d92a2eee3b7eeb3634ce89dd6"} Dec 06 08:58:30 crc kubenswrapper[4954]: I1206 08:58:30.895456 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" Dec 06 08:58:30 crc kubenswrapper[4954]: I1206 08:58:30.899378 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerStarted","Data":"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e"} 
Dec 06 08:58:30 crc kubenswrapper[4954]: I1206 08:58:30.916198 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" podStartSLOduration=2.916180293 podStartE2EDuration="2.916180293s" podCreationTimestamp="2025-12-06 08:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:58:30.915426503 +0000 UTC m=+7285.728785892" watchObservedRunningTime="2025-12-06 08:58:30.916180293 +0000 UTC m=+7285.729539682"
Dec 06 08:58:31 crc kubenswrapper[4954]: I1206 08:58:31.504269 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 06 08:58:31 crc kubenswrapper[4954]: I1206 08:58:31.909507 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerStarted","Data":"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476"}
Dec 06 08:58:31 crc kubenswrapper[4954]: I1206 08:58:31.937666 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.93764471 podStartE2EDuration="3.93764471s" podCreationTimestamp="2025-12-06 08:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:58:31.926727739 +0000 UTC m=+7286.740087148" watchObservedRunningTime="2025-12-06 08:58:31.93764471 +0000 UTC m=+7286.751004099"
Dec 06 08:58:32 crc kubenswrapper[4954]: I1206 08:58:32.917024 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api-log" containerID="cri-o://0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" gracePeriod=30
Dec 06 08:58:32 crc kubenswrapper[4954]: I1206 08:58:32.917071 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 06 08:58:32 crc kubenswrapper[4954]: I1206 08:58:32.917072 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api" containerID="cri-o://296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" gracePeriod=30
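The two "Observed pod startup duration" entries just above can be checked by hand. On my reading of pod_startup_latency_tracker.go, podStartSLOduration excludes time spent pulling images; here firstStartedPulling and lastFinishedPulling are Go zero values ("0001-01-01"), meaning no pull was needed, so the SLO and end-to-end figures coincide and both equal watchObservedRunningTime minus podCreationTimestamp. Treat that field interpretation as an assumption; the arithmetic itself is straight from the entry:

```python
from decimal import Decimal

# Seconds within minute 08:58, quoted from the dnsmasq-dns entry above.
# Decimal keeps the nanosecond digits that a float would round.
created  = Decimal("28")             # podCreationTimestamp 08:58:28
observed = Decimal("30.916180293")   # watchObservedRunningTime 08:58:30.916180293
print(observed - created)            # 2.916180293 -> podStartSLOduration and
                                     # podStartE2EDuration ("2.916180293s")
```

The cinder-api-0 entry checks out the same way: 31.93764471 minus 28 gives 3.93764471 s. A later entry for certified-operators-x7k87 (at 08:58:39) shows the case where an image actually gets pulled.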
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575492 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575664 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575761 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575881 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj57f\" (UniqueName: \"kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.575977 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.576053 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id\") pod \"16605dfd-45af-4f10-bd7a-09f9b7122140\" (UID: \"16605dfd-45af-4f10-bd7a-09f9b7122140\") " Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.576191 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.576468 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs" (OuterVolumeSpecName: "logs") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.576779 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16605dfd-45af-4f10-bd7a-09f9b7122140-logs\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.576879 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16605dfd-45af-4f10-bd7a-09f9b7122140-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.582715 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f" (OuterVolumeSpecName: "kube-api-access-lj57f") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "kube-api-access-lj57f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.583394 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.583536 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts" (OuterVolumeSpecName: "scripts") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.602660 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.621041 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data" (OuterVolumeSpecName: "config-data") pod "16605dfd-45af-4f10-bd7a-09f9b7122140" (UID: "16605dfd-45af-4f10-bd7a-09f9b7122140"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.679622 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.679918 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.679976 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.680028 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj57f\" (UniqueName: \"kubernetes.io/projected/16605dfd-45af-4f10-bd7a-09f9b7122140-kube-api-access-lj57f\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.680079 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16605dfd-45af-4f10-bd7a-09f9b7122140-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.808722 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:33 crc kubenswrapper[4954]: E1206 08:58:33.809184 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api-log" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.809205 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api-log" Dec 06 08:58:33 crc kubenswrapper[4954]: E1206 08:58:33.809224 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.809231 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.809448 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api-log" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.809471 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerName="cinder-api" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.810960 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.816166 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.884885 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.885380 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.885529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6mf\" (UniqueName: \"kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927060 4954 generic.go:334] "Generic (PLEG): container finished" podID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerID="296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" exitCode=0 Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927094 4954 generic.go:334] "Generic (PLEG): container finished" podID="16605dfd-45af-4f10-bd7a-09f9b7122140" containerID="0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" exitCode=143 Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927114 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927118 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerDied","Data":"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476"} Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927849 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerDied","Data":"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e"} Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927871 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16605dfd-45af-4f10-bd7a-09f9b7122140","Type":"ContainerDied","Data":"7371550fc87998a80dde8e0997e1d99c6b9764961c19d6ac76acf88fa9ee95ca"} Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.927887 4954 scope.go:117] "RemoveContainer" containerID="296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.963539 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.966251 4954 scope.go:117] "RemoveContainer" containerID="0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.973357 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.985057 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.986614 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.987635 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.987721 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6mf\" (UniqueName: \"kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.987842 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.988410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.988435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.991758 4954 scope.go:117] "RemoveContainer" containerID="296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.992194 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.992258 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.992399 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qcdlj" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.992977 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.993359 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 08:58:33 crc kubenswrapper[4954]: I1206 08:58:33.993561 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.001254 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.009601 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6mf\" (UniqueName: 
\"kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf\") pod \"certified-operators-x7k87\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:34 crc kubenswrapper[4954]: E1206 08:58:34.038954 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476\": container with ID starting with 296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476 not found: ID does not exist" containerID="296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.039000 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476"} err="failed to get container status \"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476\": rpc error: code = NotFound desc = could not find container \"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476\": container with ID starting with 296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476 not found: ID does not exist" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.039031 4954 scope.go:117] "RemoveContainer" containerID="0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" Dec 06 08:58:34 crc kubenswrapper[4954]: E1206 08:58:34.044093 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e\": container with ID starting with 0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e not found: ID does not exist" containerID="0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.044134 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e"} err="failed to get container status \"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e\": rpc error: code = NotFound desc = could not find container \"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e\": container with ID starting with 0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e not found: ID does not exist" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.044164 4954 scope.go:117] "RemoveContainer" containerID="296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.044586 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476"} err="failed to get container status \"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476\": rpc error: code = NotFound desc = could not find container \"296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476\": container with ID starting with 296a87fb747e98552a13699046b7a5dc74fcbe210411a357b742beb38455a476 not found: ID does not exist" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.044616 4954 scope.go:117] "RemoveContainer" containerID="0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.044890 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e"} err="failed to get container status \"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e\": rpc error: code = NotFound desc = could not find container \"0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e\": container with ID starting with 0f7f48095a0948440d2552bcd6739436dca9c6fa9faf2abc4bdec6405260698e not found: ID does not exist" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089729 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089773 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4tz\" (UniqueName: \"kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089799 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.089995 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.090035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.090067 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc 
kubenswrapper[4954]: I1206 08:58:34.090294 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.127590 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192075 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4tz\" (UniqueName: \"kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192142 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192266 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192659 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192792 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192833 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.192876 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.193008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.193038 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.193597 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.196626 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.197346 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.197807 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.198162 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.200952 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.201444 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.209830 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4tz\" (UniqueName: \"kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz\") pod \"cinder-api-0\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.397284 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.629955 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:34 crc kubenswrapper[4954]: W1206 08:58:34.912747 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae3854c_73c4_4c90_9e60_f212bb318c43.slice/crio-f35657b426d33045cf9eb2b2ce018cb2ca7d0c36ca8e86d58bae6c254c72ca7c WatchSource:0}: Error finding container f35657b426d33045cf9eb2b2ce018cb2ca7d0c36ca8e86d58bae6c254c72ca7c: Status 404 returned error can't find the container with id f35657b426d33045cf9eb2b2ce018cb2ca7d0c36ca8e86d58bae6c254c72ca7c Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.915102 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.940273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerStarted","Data":"f35657b426d33045cf9eb2b2ce018cb2ca7d0c36ca8e86d58bae6c254c72ca7c"} Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.944863 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerStarted","Data":"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0"} Dec 06 08:58:34 crc kubenswrapper[4954]: I1206 08:58:34.944910 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerStarted","Data":"0ceb4f55c0e8ffd802752bdb732fb535d723f4aa0c70239f81c47d451f859276"} Dec 06 08:58:35 crc kubenswrapper[4954]: I1206 08:58:35.460586 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16605dfd-45af-4f10-bd7a-09f9b7122140" path="/var/lib/kubelet/pods/16605dfd-45af-4f10-bd7a-09f9b7122140/volumes" Dec 06 08:58:35 crc kubenswrapper[4954]: I1206 08:58:35.956965 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerStarted","Data":"5b9fac5e6a06daaeb80e2822fa4ab96ae850613b679c2d4552a3c1d94699ce4f"} Dec 06 08:58:35 crc kubenswrapper[4954]: I1206 08:58:35.960667 4954 generic.go:334] "Generic (PLEG): container finished" podID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerID="8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0" exitCode=0 Dec 06 08:58:35 crc kubenswrapper[4954]: I1206 08:58:35.960707 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerDied","Data":"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0"} Dec 06 08:58:36 crc kubenswrapper[4954]: I1206 08:58:36.972273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerStarted","Data":"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052"} Dec 06 08:58:36 crc kubenswrapper[4954]: I1206 08:58:36.976192 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerStarted","Data":"c51f35143353c8e7cd1fe4cee375a9612c205e3442f2d0f1cb05726a19fadfd3"} Dec 06 08:58:36 crc kubenswrapper[4954]: I1206 08:58:36.976392 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 08:58:37 crc kubenswrapper[4954]: I1206 08:58:37.013261 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.013233728 podStartE2EDuration="4.013233728s" podCreationTimestamp="2025-12-06 08:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:58:37.007506805 +0000 UTC m=+7291.820866204" watchObservedRunningTime="2025-12-06 08:58:37.013233728 +0000 UTC m=+7291.826593117" Dec 06 08:58:37 crc kubenswrapper[4954]: I1206 08:58:37.989092 4954 generic.go:334] "Generic (PLEG): container finished" podID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerID="a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052" exitCode=0 Dec 06 08:58:37 crc kubenswrapper[4954]: I1206 08:58:37.989283 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerDied","Data":"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052"} Dec 06 08:58:38 crc kubenswrapper[4954]: I1206 08:58:38.882785 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" Dec 06 08:58:38 crc kubenswrapper[4954]: I1206 08:58:38.945992 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"] Dec 06 08:58:38 crc kubenswrapper[4954]: I1206 08:58:38.946238 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="dnsmasq-dns" containerID="cri-o://7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1" gracePeriod=10 Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.004092 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.57:5353: connect: connection refused" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.026318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerStarted","Data":"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549"} Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.079575 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x7k87" podStartSLOduration=3.520395331 podStartE2EDuration="6.079538013s" podCreationTimestamp="2025-12-06 08:58:33 +0000 UTC" firstStartedPulling="2025-12-06 08:58:35.962999616 +0000 UTC m=+7290.776359005" lastFinishedPulling="2025-12-06 08:58:38.522142298 +0000 UTC m=+7293.335501687" observedRunningTime="2025-12-06 08:58:39.067635067 +0000 UTC m=+7293.880994456" watchObservedRunningTime="2025-12-06 08:58:39.079538013 +0000 UTC m=+7293.892897402" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.538643 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.708093 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crst7\" (UniqueName: \"kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7\") pod \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.708510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc\") pod \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.708617 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb\") pod \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.708736 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb\") pod \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.708827 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config\") pod \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\" (UID: \"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0\") " Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.724963 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7" (OuterVolumeSpecName: "kube-api-access-crst7") pod "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" (UID: "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0"). InnerVolumeSpecName "kube-api-access-crst7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.760942 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" (UID: "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.762974 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" (UID: "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.764912 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" (UID: "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.770991 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config" (OuterVolumeSpecName: "config") pod "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" (UID: "d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.810690 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.810723 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-config\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.810737 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crst7\" (UniqueName: \"kubernetes.io/projected/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-kube-api-access-crst7\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.810749 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:39 crc kubenswrapper[4954]: I1206 08:58:39.810759 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.036690 4954 generic.go:334] "Generic (PLEG): container finished" podID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerID="7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1" exitCode=0 Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.036756 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.036808 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" event={"ID":"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0","Type":"ContainerDied","Data":"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1"} Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.036847 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6866cd7fc7-ln7tj" event={"ID":"d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0","Type":"ContainerDied","Data":"9b420856877844ada936d7f4af24fa80c9403ef2a3eab59682d3b4ad4b04156e"} Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.036868 4954 scope.go:117] "RemoveContainer" containerID="7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.056824 4954 scope.go:117] "RemoveContainer" containerID="7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.074871 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"] Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.083884 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6866cd7fc7-ln7tj"] Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.089223 4954 scope.go:117] "RemoveContainer" containerID="7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1" Dec 06 08:58:40 crc kubenswrapper[4954]: E1206 08:58:40.089749 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1\": container with ID starting with 7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1 not found: ID does not exist" containerID="7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.089781 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1"} err="failed to get container status \"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1\": rpc error: code = NotFound desc = could not find container \"7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1\": container with ID starting with 7f8a9309aa161aa4a082e1887b35028048980185b73a6c42c42df5bcc0694ac1 not found: ID does not exist" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.089805 4954 scope.go:117] "RemoveContainer" containerID="7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e" Dec 06 08:58:40 crc kubenswrapper[4954]: E1206 08:58:40.090242 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e\": container with ID starting with 7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e not found: ID does not exist" containerID="7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e" Dec 06 08:58:40 crc kubenswrapper[4954]: I1206 08:58:40.090388 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e"} err="failed to get container status 
\"7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e\": rpc error: code = NotFound desc = could not find container \"7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e\": container with ID starting with 7fb3067e63fc9857a5cb641c5245942c61e8b9b0cde14cdc0e3e6cc4983f1e5e not found: ID does not exist" Dec 06 08:58:41 crc kubenswrapper[4954]: I1206 08:58:41.453880 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" path="/var/lib/kubelet/pods/d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0/volumes" Dec 06 08:58:42 crc kubenswrapper[4954]: I1206 08:58:42.444377 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 08:58:43 crc kubenswrapper[4954]: I1206 08:58:43.067745 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520"} Dec 06 08:58:44 crc kubenswrapper[4954]: I1206 08:58:44.128518 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:44 crc kubenswrapper[4954]: I1206 08:58:44.129123 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:44 crc kubenswrapper[4954]: I1206 08:58:44.185048 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:45 crc kubenswrapper[4954]: I1206 08:58:45.130556 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:45 crc kubenswrapper[4954]: I1206 08:58:45.190388 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:46 crc kubenswrapper[4954]: I1206 08:58:46.314376 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.102893 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x7k87" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="registry-server" containerID="cri-o://1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549" gracePeriod=2 Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.542932 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.676224 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities\") pod \"71627c71-f976-48d7-89ed-6b8ae58271d9\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.676292 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content\") pod \"71627c71-f976-48d7-89ed-6b8ae58271d9\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.676366 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6mf\" (UniqueName: \"kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf\") pod \"71627c71-f976-48d7-89ed-6b8ae58271d9\" (UID: \"71627c71-f976-48d7-89ed-6b8ae58271d9\") " Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.677034 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities" (OuterVolumeSpecName: "utilities") pod "71627c71-f976-48d7-89ed-6b8ae58271d9" (UID: "71627c71-f976-48d7-89ed-6b8ae58271d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.685716 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf" (OuterVolumeSpecName: "kube-api-access-ng6mf") pod "71627c71-f976-48d7-89ed-6b8ae58271d9" (UID: "71627c71-f976-48d7-89ed-6b8ae58271d9"). InnerVolumeSpecName "kube-api-access-ng6mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.736266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71627c71-f976-48d7-89ed-6b8ae58271d9" (UID: "71627c71-f976-48d7-89ed-6b8ae58271d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.779068 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.779106 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71627c71-f976-48d7-89ed-6b8ae58271d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:47 crc kubenswrapper[4954]: I1206 08:58:47.779122 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng6mf\" (UniqueName: \"kubernetes.io/projected/71627c71-f976-48d7-89ed-6b8ae58271d9-kube-api-access-ng6mf\") on node \"crc\" DevicePath \"\"" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.114292 4954 generic.go:334] "Generic (PLEG): container finished" podID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerID="1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549" exitCode=0 Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.114356 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerDied","Data":"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549"} Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.114383 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7k87" event={"ID":"71627c71-f976-48d7-89ed-6b8ae58271d9","Type":"ContainerDied","Data":"0ceb4f55c0e8ffd802752bdb732fb535d723f4aa0c70239f81c47d451f859276"} Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.114402 4954 scope.go:117] "RemoveContainer" containerID="1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.114536 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7k87" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.138534 4954 scope.go:117] "RemoveContainer" containerID="a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.163541 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.174269 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x7k87"] Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.177579 4954 scope.go:117] "RemoveContainer" containerID="8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.202551 4954 scope.go:117] "RemoveContainer" containerID="1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549" Dec 06 08:58:48 crc kubenswrapper[4954]: E1206 08:58:48.203238 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549\": container with ID starting with 1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549 not found: ID does not exist" containerID="1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.203283 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549"} err="failed to get container status \"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549\": rpc error: code = NotFound desc = could not find container \"1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549\": container with ID starting with 1ca792d2da055ef040dc281a127695690a82d3400898a791ae19d1c910a79549 not found: ID does not exist" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.203316 4954 scope.go:117] "RemoveContainer" containerID="a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052" Dec 06 08:58:48 crc kubenswrapper[4954]: E1206 08:58:48.203852 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052\": container with ID starting with a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052 not found: ID does not exist" containerID="a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.203881 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052"} err="failed to get container status \"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052\": rpc error: code = NotFound desc = could not find container \"a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052\": container with ID starting with a135f2b653f03bef4308348e7af2eda54dec87f60777ee6e8dcaa0c92cbcf052 not found: ID does not exist" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.203897 4954 scope.go:117] "RemoveContainer" containerID="8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0" Dec 06 08:58:48 crc kubenswrapper[4954]: E1206 08:58:48.204454 4954 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0\": container with ID starting with 8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0 not found: ID does not exist" containerID="8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0" Dec 06 08:58:48 crc kubenswrapper[4954]: I1206 08:58:48.204525 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0"} err="failed to get container status \"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0\": rpc error: code = NotFound desc = could not find container \"8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0\": container with ID starting with 8ac6ac9ce343ada09740ff735c46cc67fc8059c272733069f1d91de9f4c28bb0 not found: ID does not exist" Dec 06 08:58:49 crc kubenswrapper[4954]: I1206 08:58:49.453253 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" path="/var/lib/kubelet/pods/71627c71-f976-48d7-89ed-6b8ae58271d9/volumes" Dec 06 08:59:04 crc kubenswrapper[4954]: E1206 08:59:04.840693 4954 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.114:40474->38.129.56.114:35379: write tcp 38.129.56.114:40474->38.129.56.114:35379: write: broken pipe Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.263989 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:08 crc kubenswrapper[4954]: E1206 08:59:08.264957 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="dnsmasq-dns" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.264975 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="dnsmasq-dns" Dec 06 08:59:08 crc kubenswrapper[4954]: E1206 08:59:08.264993 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="extract-utilities" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265003 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="extract-utilities" Dec 06 08:59:08 crc kubenswrapper[4954]: E1206 08:59:08.265031 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="extract-content" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265039 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="extract-content" Dec 06 08:59:08 crc kubenswrapper[4954]: E1206 08:59:08.265054 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="init" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265060 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="init" Dec 06 08:59:08 crc kubenswrapper[4954]: E1206 08:59:08.265076 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="registry-server" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265082 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" 
containerName="registry-server" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265291 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="71627c71-f976-48d7-89ed-6b8ae58271d9" containerName="registry-server" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.265319 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f421aa-a1c3-4a3a-b305-b0571b0e3eb0" containerName="dnsmasq-dns" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.266443 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.310225 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.314513 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411205 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411432 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411470 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgdk4\" (UniqueName: \"kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.411713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.513907 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.513969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.514124 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.514161 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgdk4\" (UniqueName: \"kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.514299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.514352 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.514370 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.520323 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.521060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.523829 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.533306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.556300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgdk4\" (UniqueName: \"kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4\") pod \"cinder-scheduler-0\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:08 crc kubenswrapper[4954]: I1206 08:59:08.629711 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:09 crc kubenswrapper[4954]: I1206 08:59:09.122961 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:09 crc kubenswrapper[4954]: I1206 08:59:09.281095 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerStarted","Data":"27e9c2d564da6ece1aeb0b1ab3d796bea598b9ae6f85991d4a2d91c6e5b03b93"} Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.025432 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.026399 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api-log" containerID="cri-o://5b9fac5e6a06daaeb80e2822fa4ab96ae850613b679c2d4552a3c1d94699ce4f" gracePeriod=30 Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.026511 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api" containerID="cri-o://c51f35143353c8e7cd1fe4cee375a9612c205e3442f2d0f1cb05726a19fadfd3" gracePeriod=30 Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.296903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerStarted","Data":"46391dd4358449bffd5ce900ab1b3901cff56b899696122218a52ca8096dae06"} Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.304298 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerID="5b9fac5e6a06daaeb80e2822fa4ab96ae850613b679c2d4552a3c1d94699ce4f" exitCode=143 Dec 06 08:59:10 crc kubenswrapper[4954]: I1206 08:59:10.304386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerDied","Data":"5b9fac5e6a06daaeb80e2822fa4ab96ae850613b679c2d4552a3c1d94699ce4f"} Dec 06 08:59:11 crc kubenswrapper[4954]: I1206 08:59:11.316944 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerStarted","Data":"9338ec5d4dd7ab5f6ec6a6b978da9252bde6610cc6e826ffc65e7cec279a1fea"} Dec 06 08:59:11 crc kubenswrapper[4954]: I1206 08:59:11.344967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.085850187 podStartE2EDuration="3.344939312s" podCreationTimestamp="2025-12-06 08:59:08 +0000 UTC" firstStartedPulling="2025-12-06 08:59:09.134967844 +0000 UTC 
m=+7323.948327233" lastFinishedPulling="2025-12-06 08:59:09.394056969 +0000 UTC m=+7324.207416358" observedRunningTime="2025-12-06 08:59:11.336739114 +0000 UTC m=+7326.150098523" watchObservedRunningTime="2025-12-06 08:59:11.344939312 +0000 UTC m=+7326.158298701" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.340422 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerID="c51f35143353c8e7cd1fe4cee375a9612c205e3442f2d0f1cb05726a19fadfd3" exitCode=0 Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.340745 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerDied","Data":"c51f35143353c8e7cd1fe4cee375a9612c205e3442f2d0f1cb05726a19fadfd3"} Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.551510 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.630953 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728741 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728806 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728864 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728877 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728918 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4tz\" (UniqueName: \"kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.728956 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.729013 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.729168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.729802 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.729897 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom\") pod \"7ae3854c-73c4-4c90-9e60-f212bb318c43\" (UID: \"7ae3854c-73c4-4c90-9e60-f212bb318c43\") " Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.729999 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs" (OuterVolumeSpecName: "logs") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.751863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts" (OuterVolumeSpecName: "scripts") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.760731 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz" (OuterVolumeSpecName: "kube-api-access-7x4tz") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "kube-api-access-7x4tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.760899 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ae3854c-73c4-4c90-9e60-f212bb318c43-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.760949 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae3854c-73c4-4c90-9e60-f212bb318c43-logs\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.762224 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.771153 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.802763 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.818300 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.828553 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data" (OuterVolumeSpecName: "config-data") pod "7ae3854c-73c4-4c90-9e60-f212bb318c43" (UID: "7ae3854c-73c4-4c90-9e60-f212bb318c43"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868804 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868845 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868861 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868872 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868884 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x4tz\" (UniqueName: \"kubernetes.io/projected/7ae3854c-73c4-4c90-9e60-f212bb318c43-kube-api-access-7x4tz\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868896 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:13 crc kubenswrapper[4954]: I1206 08:59:13.868908 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae3854c-73c4-4c90-9e60-f212bb318c43-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.351111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ae3854c-73c4-4c90-9e60-f212bb318c43","Type":"ContainerDied","Data":"f35657b426d33045cf9eb2b2ce018cb2ca7d0c36ca8e86d58bae6c254c72ca7c"} Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.351175 4954 scope.go:117] "RemoveContainer" containerID="c51f35143353c8e7cd1fe4cee375a9612c205e3442f2d0f1cb05726a19fadfd3" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.352060 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.379085 4954 scope.go:117] "RemoveContainer" containerID="5b9fac5e6a06daaeb80e2822fa4ab96ae850613b679c2d4552a3c1d94699ce4f" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.388678 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.399029 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.413676 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:14 crc kubenswrapper[4954]: E1206 08:59:14.414124 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.414153 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api" Dec 06 08:59:14 crc kubenswrapper[4954]: E1206 08:59:14.414194 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api-log" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.414204 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api-log" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.414424 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.414455 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" containerName="cinder-api-log" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.415525 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.419391 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.419514 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.419581 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.423237 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.480664 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.480723 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bd4t\" (UniqueName: \"kubernetes.io/projected/93ed7eaf-c22c-4933-943f-0bd990945ef1-kube-api-access-4bd4t\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.480757 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.480776 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ed7eaf-c22c-4933-943f-0bd990945ef1-logs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.480916 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.481026 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93ed7eaf-c22c-4933-943f-0bd990945ef1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.481104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data-custom\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.481146 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.481185 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-scripts\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.582191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.582492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bd4t\" (UniqueName: \"kubernetes.io/projected/93ed7eaf-c22c-4933-943f-0bd990945ef1-kube-api-access-4bd4t\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.582526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.582858 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ed7eaf-c22c-4933-943f-0bd990945ef1-logs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.582885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.583165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ed7eaf-c22c-4933-943f-0bd990945ef1-logs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.583235 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93ed7eaf-c22c-4933-943f-0bd990945ef1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.583309 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93ed7eaf-c22c-4933-943f-0bd990945ef1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.583378 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data-custom\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.585877 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.586226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-scripts\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.588554 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.588684 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.588778 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data-custom\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.589086 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.589297 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-config-data\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.590125 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93ed7eaf-c22c-4933-943f-0bd990945ef1-scripts\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.598722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bd4t\" (UniqueName: \"kubernetes.io/projected/93ed7eaf-c22c-4933-943f-0bd990945ef1-kube-api-access-4bd4t\") pod \"cinder-api-0\" (UID: \"93ed7eaf-c22c-4933-943f-0bd990945ef1\") " pod="openstack/cinder-api-0" Dec 06 08:59:14 crc kubenswrapper[4954]: I1206 08:59:14.733353 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 08:59:15 crc kubenswrapper[4954]: I1206 08:59:15.197907 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 08:59:15 crc kubenswrapper[4954]: I1206 08:59:15.367484 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93ed7eaf-c22c-4933-943f-0bd990945ef1","Type":"ContainerStarted","Data":"3299d8c7a7e729b73815caf86c36420bb4d0960118b49d6c3b791b502cc24644"} Dec 06 08:59:15 crc kubenswrapper[4954]: I1206 08:59:15.466721 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae3854c-73c4-4c90-9e60-f212bb318c43" path="/var/lib/kubelet/pods/7ae3854c-73c4-4c90-9e60-f212bb318c43/volumes" Dec 06 08:59:16 crc kubenswrapper[4954]: I1206 08:59:16.377233 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93ed7eaf-c22c-4933-943f-0bd990945ef1","Type":"ContainerStarted","Data":"b250e3cfb45392f19d89b676c08dadec4309fc9b0fb9e1235fb1f0fc4263a58c"} Dec 06 08:59:16 crc kubenswrapper[4954]: I1206 08:59:16.377550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"93ed7eaf-c22c-4933-943f-0bd990945ef1","Type":"ContainerStarted","Data":"0c7ff1f1a954e83ed49dfabc69425a277d8dc496c4275d43cbd08898fb21f673"} Dec 06 08:59:16 crc kubenswrapper[4954]: I1206 08:59:16.377703 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 08:59:16 crc kubenswrapper[4954]: I1206 08:59:16.409081 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.409058294 podStartE2EDuration="2.409058294s" podCreationTimestamp="2025-12-06 08:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:59:16.401487543 +0000 UTC m=+7331.214846932" watchObservedRunningTime="2025-12-06 08:59:16.409058294 +0000 UTC m=+7331.222417683" Dec 06 08:59:18 crc kubenswrapper[4954]: I1206 08:59:18.844529 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 08:59:18 crc kubenswrapper[4954]: I1206 08:59:18.923016 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:19 crc kubenswrapper[4954]: I1206 08:59:19.398088 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="cinder-scheduler" containerID="cri-o://46391dd4358449bffd5ce900ab1b3901cff56b899696122218a52ca8096dae06" gracePeriod=30 Dec 06 08:59:19 crc kubenswrapper[4954]: I1206 08:59:19.398145 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="probe" containerID="cri-o://9338ec5d4dd7ab5f6ec6a6b978da9252bde6610cc6e826ffc65e7cec279a1fea" gracePeriod=30 Dec 06 08:59:20 crc kubenswrapper[4954]: I1206 08:59:20.409400 4954 generic.go:334] "Generic (PLEG): container finished" podID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerID="9338ec5d4dd7ab5f6ec6a6b978da9252bde6610cc6e826ffc65e7cec279a1fea" exitCode=0 Dec 06 08:59:20 crc kubenswrapper[4954]: I1206 08:59:20.409517 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerDied","Data":"9338ec5d4dd7ab5f6ec6a6b978da9252bde6610cc6e826ffc65e7cec279a1fea"} Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.431644 4954 generic.go:334] "Generic (PLEG): container finished" podID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerID="46391dd4358449bffd5ce900ab1b3901cff56b899696122218a52ca8096dae06" exitCode=0 Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.435616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerDied","Data":"46391dd4358449bffd5ce900ab1b3901cff56b899696122218a52ca8096dae06"} Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.435676 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fefbd3c1-471e-4e7c-a736-a341e2057e8b","Type":"ContainerDied","Data":"27e9c2d564da6ece1aeb0b1ab3d796bea598b9ae6f85991d4a2d91c6e5b03b93"} Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.435690 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e9c2d564da6ece1aeb0b1ab3d796bea598b9ae6f85991d4a2d91c6e5b03b93" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.465668 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604212 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604330 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604369 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgdk4\" (UniqueName: \"kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604439 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604470 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.604505 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle\") pod \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\" (UID: \"fefbd3c1-471e-4e7c-a736-a341e2057e8b\") " Dec 06 08:59:21 crc 
kubenswrapper[4954]: I1206 08:59:21.605365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.610908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.625822 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts" (OuterVolumeSpecName: "scripts") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.631329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4" (OuterVolumeSpecName: "kube-api-access-kgdk4") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "kube-api-access-kgdk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.667315 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.697021 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data" (OuterVolumeSpecName: "config-data") pod "fefbd3c1-471e-4e7c-a736-a341e2057e8b" (UID: "fefbd3c1-471e-4e7c-a736-a341e2057e8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706285 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706318 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706329 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706340 4954 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fefbd3c1-471e-4e7c-a736-a341e2057e8b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706349 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefbd3c1-471e-4e7c-a736-a341e2057e8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:21 crc kubenswrapper[4954]: I1206 08:59:21.706358 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgdk4\" (UniqueName: \"kubernetes.io/projected/fefbd3c1-471e-4e7c-a736-a341e2057e8b-kube-api-access-kgdk4\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.440326 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.481425 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.489723 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.499593 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:22 crc kubenswrapper[4954]: E1206 08:59:22.500029 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="cinder-scheduler" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.500050 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="cinder-scheduler" Dec 06 08:59:22 crc kubenswrapper[4954]: E1206 08:59:22.500068 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="probe" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.500075 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="probe" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.500236 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="probe" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.500247 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" containerName="cinder-scheduler" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.501321 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.503972 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.508575 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620257 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/905b98df-c8e3-4553-9e44-6ce21971e83a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rvt\" (UniqueName: \"kubernetes.io/projected/905b98df-c8e3-4553-9e44-6ce21971e83a-kube-api-access-b7rvt\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620766 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620949 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-scripts\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.620999 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.722958 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-scripts\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723007 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723078 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/905b98df-c8e3-4553-9e44-6ce21971e83a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723142 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rvt\" (UniqueName: \"kubernetes.io/projected/905b98df-c8e3-4553-9e44-6ce21971e83a-kube-api-access-b7rvt\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723224 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/905b98df-c8e3-4553-9e44-6ce21971e83a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.723242 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.727300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-scripts\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.729245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.729618 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.739950 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/905b98df-c8e3-4553-9e44-6ce21971e83a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.740662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rvt\" (UniqueName: \"kubernetes.io/projected/905b98df-c8e3-4553-9e44-6ce21971e83a-kube-api-access-b7rvt\") pod \"cinder-scheduler-0\" (UID: \"905b98df-c8e3-4553-9e44-6ce21971e83a\") " 
pod="openstack/cinder-scheduler-0" Dec 06 08:59:22 crc kubenswrapper[4954]: I1206 08:59:22.820501 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 08:59:23 crc kubenswrapper[4954]: I1206 08:59:23.314073 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 08:59:23 crc kubenswrapper[4954]: I1206 08:59:23.339294 4954 scope.go:117] "RemoveContainer" containerID="b00dfc96ff5255ac93b2e6a1f9041cb8e56f587ef1bbc1dac79fb4e40710a890" Dec 06 08:59:23 crc kubenswrapper[4954]: I1206 08:59:23.455132 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefbd3c1-471e-4e7c-a736-a341e2057e8b" path="/var/lib/kubelet/pods/fefbd3c1-471e-4e7c-a736-a341e2057e8b/volumes" Dec 06 08:59:23 crc kubenswrapper[4954]: I1206 08:59:23.457144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"905b98df-c8e3-4553-9e44-6ce21971e83a","Type":"ContainerStarted","Data":"966df68f2f09b1f86b5c77410c65be3377d50f318789313b38f70647bf9c28ca"} Dec 06 08:59:24 crc kubenswrapper[4954]: I1206 08:59:24.465894 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"905b98df-c8e3-4553-9e44-6ce21971e83a","Type":"ContainerStarted","Data":"006a13611129c0364212cd91d6922c399ecc9e359b1dbc6a706c4cf42d936f79"} Dec 06 08:59:24 crc kubenswrapper[4954]: I1206 08:59:24.466250 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"905b98df-c8e3-4553-9e44-6ce21971e83a","Type":"ContainerStarted","Data":"d1c3b4218450dc1afcf6dcd18ca943e06c00f101844df738d41e08c0ead28c3b"} Dec 06 08:59:24 crc kubenswrapper[4954]: I1206 08:59:24.490533 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.490506544 podStartE2EDuration="2.490506544s" podCreationTimestamp="2025-12-06 08:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:59:24.483088786 +0000 UTC m=+7339.296448185" watchObservedRunningTime="2025-12-06 08:59:24.490506544 +0000 UTC m=+7339.303865933" Dec 06 08:59:26 crc kubenswrapper[4954]: I1206 08:59:26.833038 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 08:59:27 crc kubenswrapper[4954]: I1206 08:59:27.821313 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 08:59:33 crc kubenswrapper[4954]: I1206 08:59:33.024063 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.402712 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-77p2n"] Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.404445 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.414049 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c137-account-create-update-xv896"] Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.415776 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.420106 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.435884 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-77p2n"] Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.465163 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c137-account-create-update-xv896"] Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.467538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr48h\" (UniqueName: \"kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.467875 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.569920 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.570018 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.570219 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghf4\" (UniqueName: \"kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.570258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr48h\" (UniqueName: \"kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.570779 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.589830 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fr48h\" (UniqueName: \"kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h\") pod \"glance-db-create-77p2n\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.671707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.671875 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cghf4\" (UniqueName: \"kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.672584 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.688465 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghf4\" (UniqueName: \"kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4\") pod \"glance-c137-account-create-update-xv896\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.727211 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77p2n" Dec 06 08:59:35 crc kubenswrapper[4954]: I1206 08:59:35.745867 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.215908 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-77p2n"] Dec 06 08:59:36 crc kubenswrapper[4954]: W1206 08:59:36.217394 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d763e6_1e5f_412b_9e16_f4cdf3cf98ac.slice/crio-7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3 WatchSource:0}: Error finding container 7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3: Status 404 returned error can't find the container with id 7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3 Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.290346 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c137-account-create-update-xv896"] Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.595772 4954 generic.go:334] "Generic (PLEG): container finished" podID="25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" containerID="428d38a6aea45d706de5c44686df6bc13d8dcdef2a6d045755f54720bb726c17" exitCode=0 Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.596163 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77p2n" event={"ID":"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac","Type":"ContainerDied","Data":"428d38a6aea45d706de5c44686df6bc13d8dcdef2a6d045755f54720bb726c17"} Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.596199 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77p2n" event={"ID":"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac","Type":"ContainerStarted","Data":"7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3"} Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.598530 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c137-account-create-update-xv896" event={"ID":"1478a06b-de76-4e6e-a09f-bf3599f5fad0","Type":"ContainerStarted","Data":"2653f73701e96675d3a7d4833b20eb482836944b84f5717ca05bdfe1f5de6196"} Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.598555 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c137-account-create-update-xv896" event={"ID":"1478a06b-de76-4e6e-a09f-bf3599f5fad0","Type":"ContainerStarted","Data":"adf2fda65b8b7534c3083a216e227d037f79c629456f7bfc410e40df4a989b50"} Dec 06 08:59:36 crc kubenswrapper[4954]: I1206 08:59:36.634114 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c137-account-create-update-xv896" podStartSLOduration=1.6340910979999999 podStartE2EDuration="1.634091098s" podCreationTimestamp="2025-12-06 08:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 08:59:36.628682874 +0000 UTC m=+7351.442042263" watchObservedRunningTime="2025-12-06 08:59:36.634091098 +0000 UTC m=+7351.447450487" Dec 06 08:59:37 crc kubenswrapper[4954]: I1206 08:59:37.609210 4954 generic.go:334] "Generic (PLEG): container finished" podID="1478a06b-de76-4e6e-a09f-bf3599f5fad0" containerID="2653f73701e96675d3a7d4833b20eb482836944b84f5717ca05bdfe1f5de6196" exitCode=0 Dec 06 08:59:37 crc kubenswrapper[4954]: I1206 08:59:37.609297 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c137-account-create-update-xv896" 
event={"ID":"1478a06b-de76-4e6e-a09f-bf3599f5fad0","Type":"ContainerDied","Data":"2653f73701e96675d3a7d4833b20eb482836944b84f5717ca05bdfe1f5de6196"} Dec 06 08:59:37 crc kubenswrapper[4954]: I1206 08:59:37.985777 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77p2n" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.111251 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr48h\" (UniqueName: \"kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h\") pod \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.112441 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts\") pod \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\" (UID: \"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac\") " Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.112869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" (UID: "25d763e6-1e5f-412b-9e16-f4cdf3cf98ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.113317 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.130840 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h" (OuterVolumeSpecName: "kube-api-access-fr48h") pod "25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" (UID: "25d763e6-1e5f-412b-9e16-f4cdf3cf98ac"). InnerVolumeSpecName "kube-api-access-fr48h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.214880 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr48h\" (UniqueName: \"kubernetes.io/projected/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac-kube-api-access-fr48h\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.623691 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-77p2n" event={"ID":"25d763e6-1e5f-412b-9e16-f4cdf3cf98ac","Type":"ContainerDied","Data":"7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3"} Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.623791 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c37db0dc2ea4126a0e2a956231ed23e50e90efd14819b0b745fed5c2d6604d3" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.623704 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-77p2n" Dec 06 08:59:38 crc kubenswrapper[4954]: I1206 08:59:38.953198 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.029334 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cghf4\" (UniqueName: \"kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4\") pod \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.029619 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts\") pod \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\" (UID: \"1478a06b-de76-4e6e-a09f-bf3599f5fad0\") " Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.030539 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1478a06b-de76-4e6e-a09f-bf3599f5fad0" (UID: "1478a06b-de76-4e6e-a09f-bf3599f5fad0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.036608 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4" (OuterVolumeSpecName: "kube-api-access-cghf4") pod "1478a06b-de76-4e6e-a09f-bf3599f5fad0" (UID: "1478a06b-de76-4e6e-a09f-bf3599f5fad0"). InnerVolumeSpecName "kube-api-access-cghf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.131462 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cghf4\" (UniqueName: \"kubernetes.io/projected/1478a06b-de76-4e6e-a09f-bf3599f5fad0-kube-api-access-cghf4\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.131498 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1478a06b-de76-4e6e-a09f-bf3599f5fad0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.645774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c137-account-create-update-xv896" event={"ID":"1478a06b-de76-4e6e-a09f-bf3599f5fad0","Type":"ContainerDied","Data":"adf2fda65b8b7534c3083a216e227d037f79c629456f7bfc410e40df4a989b50"} Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.645827 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf2fda65b8b7534c3083a216e227d037f79c629456f7bfc410e40df4a989b50" Dec 06 08:59:39 crc kubenswrapper[4954]: I1206 08:59:39.645905 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c137-account-create-update-xv896" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.589017 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tv6n7"] Dec 06 08:59:40 crc kubenswrapper[4954]: E1206 08:59:40.589871 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1478a06b-de76-4e6e-a09f-bf3599f5fad0" containerName="mariadb-account-create-update" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.590299 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1478a06b-de76-4e6e-a09f-bf3599f5fad0" containerName="mariadb-account-create-update" Dec 06 08:59:40 crc kubenswrapper[4954]: E1206 08:59:40.590335 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" containerName="mariadb-database-create" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.590345 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" containerName="mariadb-database-create" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.590545 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1478a06b-de76-4e6e-a09f-bf3599f5fad0" containerName="mariadb-account-create-update" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.590589 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" containerName="mariadb-database-create" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.591331 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.595929 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xcnwj" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.599362 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tv6n7"] Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.602925 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.661498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.661607 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8d4\" (UniqueName: \"kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.661692 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.661726 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.791510 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8d4\" (UniqueName: \"kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.791791 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.791881 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.792117 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.803083 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.810291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.811143 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.822684 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8d4\" (UniqueName: \"kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4\") pod \"glance-db-sync-tv6n7\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:40 crc kubenswrapper[4954]: I1206 08:59:40.913887 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tv6n7" Dec 06 08:59:41 crc kubenswrapper[4954]: I1206 08:59:41.304216 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tv6n7"] Dec 06 08:59:41 crc kubenswrapper[4954]: I1206 08:59:41.666372 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tv6n7" event={"ID":"edc9b245-21ff-4118-8725-2c0ad2eedf7c","Type":"ContainerStarted","Data":"b227a71c9986ac561f184fbc78cd6deceb4817f3093bee2356d38a258c7a8212"} Dec 06 08:59:55 crc kubenswrapper[4954]: I1206 08:59:55.964997 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 08:59:55 crc kubenswrapper[4954]: I1206 08:59:55.967546 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:55 crc kubenswrapper[4954]: I1206 08:59:55.990899 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.077677 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.077749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.077832 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb79m\" (UniqueName: \"kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.179846 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb79m\" (UniqueName: \"kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.179935 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.179984 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 
08:59:56.180432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.180703 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.198416 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb79m\" (UniqueName: \"kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m\") pod \"community-operators-d82k4\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.294896 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d82k4" Dec 06 08:59:56 crc kubenswrapper[4954]: I1206 08:59:56.803875 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 08:59:56 crc kubenswrapper[4954]: W1206 08:59:56.809264 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f5ca97_be0e_45b9_97df_f72b98bdce09.slice/crio-b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f WatchSource:0}: Error finding container b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f: Status 404 returned error can't find the container with id b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f Dec 06 08:59:57 crc kubenswrapper[4954]: I1206 08:59:57.802034 4954 generic.go:334] "Generic (PLEG): container finished" podID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerID="2c5d13112bfd47d6e296f52208c803d39c67ba3734ba30a7ecdb80dc5bd604ff" exitCode=0 Dec 06 08:59:57 crc kubenswrapper[4954]: I1206 08:59:57.802117 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerDied","Data":"2c5d13112bfd47d6e296f52208c803d39c67ba3734ba30a7ecdb80dc5bd604ff"} Dec 06 08:59:57 crc kubenswrapper[4954]: I1206 08:59:57.802625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerStarted","Data":"b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f"} Dec 06 08:59:58 crc kubenswrapper[4954]: I1206 08:59:58.813068 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerStarted","Data":"880d28c1d9494c7097260505ef6e861d83d2dda7c7a72f26aa870e73c47d8ac4"} Dec 06 08:59:59 crc kubenswrapper[4954]: I1206 08:59:59.823163 4954 generic.go:334] "Generic (PLEG): container finished" podID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerID="880d28c1d9494c7097260505ef6e861d83d2dda7c7a72f26aa870e73c47d8ac4" exitCode=0 Dec 06 08:59:59 crc kubenswrapper[4954]: I1206 
08:59:59.823260 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerDied","Data":"880d28c1d9494c7097260505ef6e861d83d2dda7c7a72f26aa870e73c47d8ac4"} Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.135652 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb"] Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.137309 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.141228 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.141249 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.151107 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb"] Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.258994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.259040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.259063 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgl8m\" (UniqueName: \"kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.360787 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.360832 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.360860 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgl8m\" (UniqueName: \"kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.362984 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.367536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.377881 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgl8m\" (UniqueName: \"kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m\") pod \"collect-profiles-29416860-mc6mb\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:00 crc kubenswrapper[4954]: I1206 09:00:00.458269 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.035444 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb"] Dec 06 09:00:01 crc kubenswrapper[4954]: W1206 09:00:01.039262 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff71e07_e11a_46fe_873b_4503a384170e.slice/crio-8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd WatchSource:0}: Error finding container 8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd: Status 404 returned error can't find the container with id 8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.846487 4954 generic.go:334] "Generic (PLEG): container finished" podID="fff71e07-e11a-46fe-873b-4503a384170e" containerID="333f424f7ebedbbabc5761b83016cf316e7a55a8f0bd716ca04772782f6d365f" exitCode=0 Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.846593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" event={"ID":"fff71e07-e11a-46fe-873b-4503a384170e","Type":"ContainerDied","Data":"333f424f7ebedbbabc5761b83016cf316e7a55a8f0bd716ca04772782f6d365f"} Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.846956 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" event={"ID":"fff71e07-e11a-46fe-873b-4503a384170e","Type":"ContainerStarted","Data":"8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd"} Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.854546 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerStarted","Data":"57da20c78f44687d6e19a19e0f7e423131cc5a44909bffe240062ab1d39edca5"} Dec 06 09:00:01 crc kubenswrapper[4954]: I1206 09:00:01.909640 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d82k4" podStartSLOduration=3.853562116 podStartE2EDuration="6.909604953s" podCreationTimestamp="2025-12-06 08:59:55 +0000 UTC" firstStartedPulling="2025-12-06 08:59:57.803589251 +0000 UTC m=+7372.616948640" lastFinishedPulling="2025-12-06 09:00:00.859632098 +0000 UTC m=+7375.672991477" observedRunningTime="2025-12-06 09:00:01.888875111 +0000 UTC m=+7376.702234510" watchObservedRunningTime="2025-12-06 09:00:01.909604953 +0000 UTC m=+7376.722964342" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.202641 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.324969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume\") pod \"fff71e07-e11a-46fe-873b-4503a384170e\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.325295 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgl8m\" (UniqueName: \"kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m\") pod \"fff71e07-e11a-46fe-873b-4503a384170e\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.325373 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume\") pod \"fff71e07-e11a-46fe-873b-4503a384170e\" (UID: \"fff71e07-e11a-46fe-873b-4503a384170e\") " Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.327994 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume" (OuterVolumeSpecName: "config-volume") pod "fff71e07-e11a-46fe-873b-4503a384170e" (UID: "fff71e07-e11a-46fe-873b-4503a384170e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.332096 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fff71e07-e11a-46fe-873b-4503a384170e" (UID: "fff71e07-e11a-46fe-873b-4503a384170e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.332232 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m" (OuterVolumeSpecName: "kube-api-access-rgl8m") pod "fff71e07-e11a-46fe-873b-4503a384170e" (UID: "fff71e07-e11a-46fe-873b-4503a384170e"). InnerVolumeSpecName "kube-api-access-rgl8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.429234 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fff71e07-e11a-46fe-873b-4503a384170e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.429983 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgl8m\" (UniqueName: \"kubernetes.io/projected/fff71e07-e11a-46fe-873b-4503a384170e-kube-api-access-rgl8m\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.430055 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fff71e07-e11a-46fe-873b-4503a384170e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.872312 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" event={"ID":"fff71e07-e11a-46fe-873b-4503a384170e","Type":"ContainerDied","Data":"8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd"} Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.872383 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb8b88221078ed3fb9d3e339b215af7b0f5fb96980f7856b3a962a0448eb0fd" Dec 06 09:00:03 crc kubenswrapper[4954]: I1206 09:00:03.872391 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb" Dec 06 09:00:04 crc kubenswrapper[4954]: I1206 09:00:04.296397 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c"] Dec 06 09:00:04 crc kubenswrapper[4954]: I1206 09:00:04.304056 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416815-lqg6c"] Dec 06 09:00:05 crc kubenswrapper[4954]: I1206 09:00:05.452965 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f" path="/var/lib/kubelet/pods/3f6cbcc3-d307-42f7-9b4e-3c6f8ab5848f/volumes" Dec 06 09:00:06 crc kubenswrapper[4954]: I1206 09:00:06.295286 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:06 crc kubenswrapper[4954]: I1206 09:00:06.295416 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:06 crc kubenswrapper[4954]: I1206 09:00:06.341692 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:06 crc kubenswrapper[4954]: I1206 09:00:06.946202 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:07 crc kubenswrapper[4954]: I1206 09:00:07.012851 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 09:00:08 crc kubenswrapper[4954]: I1206 09:00:08.914827 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d82k4" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="registry-server" 
containerID="cri-o://57da20c78f44687d6e19a19e0f7e423131cc5a44909bffe240062ab1d39edca5" gracePeriod=2 Dec 06 09:00:09 crc kubenswrapper[4954]: I1206 09:00:09.927020 4954 generic.go:334] "Generic (PLEG): container finished" podID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerID="57da20c78f44687d6e19a19e0f7e423131cc5a44909bffe240062ab1d39edca5" exitCode=0 Dec 06 09:00:09 crc kubenswrapper[4954]: I1206 09:00:09.927078 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerDied","Data":"57da20c78f44687d6e19a19e0f7e423131cc5a44909bffe240062ab1d39edca5"} Dec 06 09:00:09 crc kubenswrapper[4954]: I1206 09:00:09.927401 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d82k4" event={"ID":"98f5ca97-be0e-45b9-97df-f72b98bdce09","Type":"ContainerDied","Data":"b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f"} Dec 06 09:00:09 crc kubenswrapper[4954]: I1206 09:00:09.927418 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b298b8a3c9723f3fa332cf8f2105f25e4df320bd6741ff64ec1c8c0dff3cd03f" Dec 06 09:00:09 crc kubenswrapper[4954]: I1206 09:00:09.949778 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.071404 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities\") pod \"98f5ca97-be0e-45b9-97df-f72b98bdce09\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.071465 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb79m\" (UniqueName: \"kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m\") pod \"98f5ca97-be0e-45b9-97df-f72b98bdce09\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.071724 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content\") pod \"98f5ca97-be0e-45b9-97df-f72b98bdce09\" (UID: \"98f5ca97-be0e-45b9-97df-f72b98bdce09\") " Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.076293 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities" (OuterVolumeSpecName: "utilities") pod "98f5ca97-be0e-45b9-97df-f72b98bdce09" (UID: "98f5ca97-be0e-45b9-97df-f72b98bdce09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.081913 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m" (OuterVolumeSpecName: "kube-api-access-cb79m") pod "98f5ca97-be0e-45b9-97df-f72b98bdce09" (UID: "98f5ca97-be0e-45b9-97df-f72b98bdce09"). InnerVolumeSpecName "kube-api-access-cb79m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.128117 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98f5ca97-be0e-45b9-97df-f72b98bdce09" (UID: "98f5ca97-be0e-45b9-97df-f72b98bdce09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.173860 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.173898 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f5ca97-be0e-45b9-97df-f72b98bdce09-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.173911 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb79m\" (UniqueName: \"kubernetes.io/projected/98f5ca97-be0e-45b9-97df-f72b98bdce09-kube-api-access-cb79m\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.934371 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d82k4" Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.974113 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 09:00:10 crc kubenswrapper[4954]: I1206 09:00:10.980349 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d82k4"] Dec 06 09:00:11 crc kubenswrapper[4954]: I1206 09:00:11.471912 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" path="/var/lib/kubelet/pods/98f5ca97-be0e-45b9-97df-f72b98bdce09/volumes" Dec 06 09:00:21 crc kubenswrapper[4954]: I1206 09:00:21.027371 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tv6n7" event={"ID":"edc9b245-21ff-4118-8725-2c0ad2eedf7c","Type":"ContainerStarted","Data":"71d657c037805e8342ad47387d7eed5887aeef88e929425e622e33b9dd67c89b"} Dec 06 09:00:21 crc kubenswrapper[4954]: I1206 09:00:21.084263 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tv6n7" podStartSLOduration=2.322604509 podStartE2EDuration="41.084242921s" podCreationTimestamp="2025-12-06 08:59:40 +0000 UTC" firstStartedPulling="2025-12-06 08:59:41.309361361 +0000 UTC m=+7356.122720750" lastFinishedPulling="2025-12-06 09:00:20.070999773 +0000 UTC m=+7394.884359162" observedRunningTime="2025-12-06 09:00:21.040842425 +0000 UTC m=+7395.854201814" watchObservedRunningTime="2025-12-06 09:00:21.084242921 +0000 UTC m=+7395.897602310" Dec 06 09:00:23 crc kubenswrapper[4954]: I1206 09:00:23.510474 4954 scope.go:117] "RemoveContainer" containerID="801ac756c0693be85b21854668a57a9627135699f1d8a4611326e3352422245e" Dec 06 09:00:24 crc kubenswrapper[4954]: I1206 09:00:24.054060 4954 generic.go:334] "Generic (PLEG): container finished" podID="edc9b245-21ff-4118-8725-2c0ad2eedf7c" containerID="71d657c037805e8342ad47387d7eed5887aeef88e929425e622e33b9dd67c89b" exitCode=0 Dec 06 09:00:24 crc kubenswrapper[4954]: I1206 09:00:24.054143 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tv6n7" event={"ID":"edc9b245-21ff-4118-8725-2c0ad2eedf7c","Type":"ContainerDied","Data":"71d657c037805e8342ad47387d7eed5887aeef88e929425e622e33b9dd67c89b"} Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.460385 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tv6n7" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.561467 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle\") pod \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.561554 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data\") pod \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.561614 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data\") pod \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.561678 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8d4\" (UniqueName: \"kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4\") pod \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\" (UID: \"edc9b245-21ff-4118-8725-2c0ad2eedf7c\") " Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.568438 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "edc9b245-21ff-4118-8725-2c0ad2eedf7c" (UID: "edc9b245-21ff-4118-8725-2c0ad2eedf7c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.572444 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4" (OuterVolumeSpecName: "kube-api-access-qr8d4") pod "edc9b245-21ff-4118-8725-2c0ad2eedf7c" (UID: "edc9b245-21ff-4118-8725-2c0ad2eedf7c"). InnerVolumeSpecName "kube-api-access-qr8d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.591547 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edc9b245-21ff-4118-8725-2c0ad2eedf7c" (UID: "edc9b245-21ff-4118-8725-2c0ad2eedf7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.618532 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data" (OuterVolumeSpecName: "config-data") pod "edc9b245-21ff-4118-8725-2c0ad2eedf7c" (UID: "edc9b245-21ff-4118-8725-2c0ad2eedf7c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.663732 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8d4\" (UniqueName: \"kubernetes.io/projected/edc9b245-21ff-4118-8725-2c0ad2eedf7c-kube-api-access-qr8d4\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.663766 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.663776 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:25 crc kubenswrapper[4954]: I1206 09:00:25.663785 4954 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edc9b245-21ff-4118-8725-2c0ad2eedf7c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.077622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tv6n7" event={"ID":"edc9b245-21ff-4118-8725-2c0ad2eedf7c","Type":"ContainerDied","Data":"b227a71c9986ac561f184fbc78cd6deceb4817f3093bee2356d38a258c7a8212"} Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.077671 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b227a71c9986ac561f184fbc78cd6deceb4817f3093bee2356d38a258c7a8212" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.077699 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tv6n7" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.474323 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:26 crc kubenswrapper[4954]: E1206 09:00:26.475039 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="extract-utilities" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475060 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="extract-utilities" Dec 06 09:00:26 crc kubenswrapper[4954]: E1206 09:00:26.475077 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc9b245-21ff-4118-8725-2c0ad2eedf7c" containerName="glance-db-sync" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475086 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc9b245-21ff-4118-8725-2c0ad2eedf7c" containerName="glance-db-sync" Dec 06 09:00:26 crc kubenswrapper[4954]: E1206 09:00:26.475121 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="extract-content" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475129 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="extract-content" Dec 06 09:00:26 crc kubenswrapper[4954]: E1206 09:00:26.475150 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff71e07-e11a-46fe-873b-4503a384170e" containerName="collect-profiles" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475158 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff71e07-e11a-46fe-873b-4503a384170e" containerName="collect-profiles" Dec 06 09:00:26 crc kubenswrapper[4954]: E1206 09:00:26.475166 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="registry-server" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475173 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="registry-server" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475452 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff71e07-e11a-46fe-873b-4503a384170e" containerName="collect-profiles" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475481 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc9b245-21ff-4118-8725-2c0ad2eedf7c" containerName="glance-db-sync" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.475506 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f5ca97-be0e-45b9-97df-f72b98bdce09" containerName="registry-server" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.476760 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.481946 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xcnwj" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.482340 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.482476 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.493210 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.495712 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.523477 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.534941 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.578420 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.578495 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.578519 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.578541 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.578611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z6l\" (UniqueName: \"kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579146 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579276 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579324 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ld2\" (UniqueName: \"kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.579424 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.580918 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.583310 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.585458 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.594633 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681442 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681507 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681533 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681556 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681604 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681636 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5z6l\" (UniqueName: \"kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681739 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc 
kubenswrapper[4954]: I1206 09:00:26.681951 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.681990 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqczz\" (UniqueName: \"kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682065 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682077 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682113 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682140 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ld2\" (UniqueName: \"kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682359 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682383 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc 
kubenswrapper[4954]: I1206 09:00:26.682413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682735 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682739 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.682972 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.683365 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.683396 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.689381 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.694152 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.694381 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.700630 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ld2\" (UniqueName: 
\"kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2\") pod \"dnsmasq-dns-6895dc775f-db2wf\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.705484 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5z6l\" (UniqueName: \"kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l\") pod \"glance-default-external-api-0\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.784514 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.784897 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.784971 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.784997 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqczz\" (UniqueName: \"kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.785032 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.785077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.785528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.785627 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.788872 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.789165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.789981 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.797002 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.805410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqczz\" (UniqueName: \"kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz\") pod \"glance-default-internal-api-0\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.820995 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:26 crc kubenswrapper[4954]: I1206 09:00:26.909059 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:27 crc kubenswrapper[4954]: I1206 09:00:27.400995 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:00:27 crc kubenswrapper[4954]: I1206 09:00:27.500378 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:27 crc kubenswrapper[4954]: W1206 09:00:27.532052 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59924d65_bfbc_4d6e_a896_0318d18cad56.slice/crio-8d59b2d8851607120e9af4ffc98e15ca5b365caff003e318e48f5a412ac5a2cd WatchSource:0}: Error finding container 8d59b2d8851607120e9af4ffc98e15ca5b365caff003e318e48f5a412ac5a2cd: Status 404 returned error can't find the container with id 8d59b2d8851607120e9af4ffc98e15ca5b365caff003e318e48f5a412ac5a2cd Dec 06 09:00:27 crc kubenswrapper[4954]: I1206 09:00:27.840795 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:27 crc kubenswrapper[4954]: W1206 09:00:27.870665 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74461432_266d_419b_a923_bd4c4d5e0e22.slice/crio-82ba69bcde26b90d13cbcf9e7d3ddbc48851cae2b7509a1956ba02a846d9033a WatchSource:0}: Error finding container 82ba69bcde26b90d13cbcf9e7d3ddbc48851cae2b7509a1956ba02a846d9033a: Status 404 returned error can't find the container with id 82ba69bcde26b90d13cbcf9e7d3ddbc48851cae2b7509a1956ba02a846d9033a Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.020983 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.141221 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerStarted","Data":"82ba69bcde26b90d13cbcf9e7d3ddbc48851cae2b7509a1956ba02a846d9033a"} Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.143826 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerStarted","Data":"8d59b2d8851607120e9af4ffc98e15ca5b365caff003e318e48f5a412ac5a2cd"} Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.146436 4954 generic.go:334] "Generic (PLEG): container finished" podID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerID="af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e" exitCode=0 Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.146477 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" event={"ID":"1122f84e-bd24-4177-9ce1-00a28c07cef1","Type":"ContainerDied","Data":"af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e"} Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.146499 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" event={"ID":"1122f84e-bd24-4177-9ce1-00a28c07cef1","Type":"ContainerStarted","Data":"9cdbacd5bb094692882df9a1605208d23159148341eadabc872616b10d421b14"} Dec 06 09:00:28 crc kubenswrapper[4954]: I1206 09:00:28.822264 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.168116 
4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerStarted","Data":"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4"} Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.168170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerStarted","Data":"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423"} Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.170513 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" event={"ID":"1122f84e-bd24-4177-9ce1-00a28c07cef1","Type":"ContainerStarted","Data":"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c"} Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.171845 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.174682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerStarted","Data":"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f"} Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.174726 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerStarted","Data":"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88"} Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.174861 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-log" containerID="cri-o://5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" gracePeriod=30 Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.175209 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-httpd" containerID="cri-o://74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" gracePeriod=30 Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.202830 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" podStartSLOduration=3.202806148 podStartE2EDuration="3.202806148s" podCreationTimestamp="2025-12-06 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:29.197886468 +0000 UTC m=+7404.011245877" watchObservedRunningTime="2025-12-06 09:00:29.202806148 +0000 UTC m=+7404.016165537" Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.226418 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.226388596 podStartE2EDuration="3.226388596s" podCreationTimestamp="2025-12-06 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:29.22278248 +0000 UTC m=+7404.036141889" watchObservedRunningTime="2025-12-06 09:00:29.226388596 +0000 UTC m=+7404.039747985" 
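
The "Observed pod startup duration" records are internally consistent: podStartE2EDuration spans pod creation to the watched running time, and podStartSLOduration is that span minus the image-pull window measured on the kubelet's monotonic clock (the m=+... offsets). A sketch re-deriving the community-operators-d82k4 figures reported further up; for the dnsmasq and glance pods just above, firstStartedPulling/lastFinishedPulling carry the zero-value sentinel "0001-01-01 00:00:00", no pull time is subtracted, and SLO equals E2E:

    from datetime import datetime, timezone

    # Timestamps from the community-operators-d82k4 record above.
    created  = datetime(2025, 12, 6, 8, 59, 55, tzinfo=timezone.utc)            # podCreationTimestamp
    observed = datetime(2025, 12, 6, 9, 0, 1, 909605, tzinfo=timezone.utc)      # watchObservedRunningTime (ns rounded to us)

    # Image-pull window on the monotonic clock, from the m=+... offsets.
    first_started_pulling = 7372.616948640
    last_finished_pulling = 7375.672991477

    e2e  = (observed - created).total_seconds()           # ~6.909605 (podStartE2EDuration "6.909604953s")
    pull = last_finished_pulling - first_started_pulling  # ~3.056043 seconds spent pulling the image
    slo  = e2e - pull                                     # ~3.853562 (podStartSLOduration=3.853562116)
    print(f"e2e={e2e:.6f}s pull={pull:.6f}s slo={slo:.6f}s")
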
Dec 06 09:00:29 crc kubenswrapper[4954]: I1206 09:00:29.967212 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.073776 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.073887 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.073913 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.073979 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.074031 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5z6l\" (UniqueName: \"kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.074067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts\") pod \"59924d65-bfbc-4d6e-a896-0318d18cad56\" (UID: \"59924d65-bfbc-4d6e-a896-0318d18cad56\") "
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.074599 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs" (OuterVolumeSpecName: "logs") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.074926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.095110 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts" (OuterVolumeSpecName: "scripts") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.098830 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l" (OuterVolumeSpecName: "kube-api-access-h5z6l") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "kube-api-access-h5z6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.171899 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data" (OuterVolumeSpecName: "config-data") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.179293 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.179329 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.179341 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59924d65-bfbc-4d6e-a896-0318d18cad56-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.179354 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5z6l\" (UniqueName: \"kubernetes.io/projected/59924d65-bfbc-4d6e-a896-0318d18cad56-kube-api-access-h5z6l\") on node \"crc\" DevicePath \"\""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.179367 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.181701 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59924d65-bfbc-4d6e-a896-0318d18cad56" (UID: "59924d65-bfbc-4d6e-a896-0318d18cad56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.208853 4954 generic.go:334] "Generic (PLEG): container finished" podID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerID="74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" exitCode=0
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.208901 4954 generic.go:334] "Generic (PLEG): container finished" podID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerID="5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" exitCode=143
Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.209788 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210591 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-log" containerID="cri-o://bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" gracePeriod=30 Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerDied","Data":"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f"} Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerDied","Data":"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88"} Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59924d65-bfbc-4d6e-a896-0318d18cad56","Type":"ContainerDied","Data":"8d59b2d8851607120e9af4ffc98e15ca5b365caff003e318e48f5a412ac5a2cd"} Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210848 4954 scope.go:117] "RemoveContainer" containerID="74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.210986 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-httpd" containerID="cri-o://9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" gracePeriod=30 Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.261792 4954 scope.go:117] "RemoveContainer" containerID="5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.264138 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.264116076 podStartE2EDuration="4.264116076s" podCreationTimestamp="2025-12-06 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:30.250264287 +0000 UTC m=+7405.063623686" watchObservedRunningTime="2025-12-06 09:00:30.264116076 +0000 UTC m=+7405.077475465" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.281945 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59924d65-bfbc-4d6e-a896-0318d18cad56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.285072 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.293511 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.312445 4954 scope.go:117] "RemoveContainer" containerID="74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" Dec 06 09:00:30 crc kubenswrapper[4954]: E1206 09:00:30.312987 4954 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f\": container with ID starting with 74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f not found: ID does not exist" containerID="74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313031 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f"} err="failed to get container status \"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f\": rpc error: code = NotFound desc = could not find container \"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f\": container with ID starting with 74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f not found: ID does not exist" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313060 4954 scope.go:117] "RemoveContainer" containerID="5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" Dec 06 09:00:30 crc kubenswrapper[4954]: E1206 09:00:30.313409 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88\": container with ID starting with 5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88 not found: ID does not exist" containerID="5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313449 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88"} err="failed to get container status \"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88\": rpc error: code = NotFound desc = could not find container \"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88\": container with ID starting with 5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88 not found: ID does not exist" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313478 4954 scope.go:117] "RemoveContainer" containerID="74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313721 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f"} err="failed to get container status \"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f\": rpc error: code = NotFound desc = could not find container \"74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f\": container with ID starting with 74fcc0dfd6959534f8be02b1abc1a6368c48639c1fdce6d5ae69355eb86c700f not found: ID does not exist" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313745 4954 scope.go:117] "RemoveContainer" containerID="5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.313972 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88"} err="failed to get container status \"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88\": rpc error: code = NotFound desc = could not find container 
\"5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88\": container with ID starting with 5505921560b0c4ebca6646728c10349e40b19296bd262d3bddb380cad0d01e88 not found: ID does not exist" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.323607 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:30 crc kubenswrapper[4954]: E1206 09:00:30.323979 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-httpd" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.323994 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-httpd" Dec 06 09:00:30 crc kubenswrapper[4954]: E1206 09:00:30.324022 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-log" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.324029 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-log" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.324191 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-log" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.324212 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" containerName="glance-httpd" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.325136 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.333087 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.333515 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.342013 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.383961 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384017 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384043 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384065 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384105 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384138 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhfh\" (UniqueName: \"kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.384207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.485987 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486039 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486073 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486097 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486146 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486179 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjhfh\" (UniqueName: \"kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486946 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.486970 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.490410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.491045 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.495259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.495537 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.509628 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhfh\" (UniqueName: \"kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh\") pod \"glance-default-external-api-0\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") " pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.656029 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:00:30 crc kubenswrapper[4954]: I1206 09:00:30.972433 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099347 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099822 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqczz\" (UniqueName: \"kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099898 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099963 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099999 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.100059 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts\") pod \"74461432-266d-419b-a923-bd4c4d5e0e22\" (UID: \"74461432-266d-419b-a923-bd4c4d5e0e22\") " Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.100555 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.099905 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs" (OuterVolumeSpecName: "logs") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.125678 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts" (OuterVolumeSpecName: "scripts") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.126433 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz" (OuterVolumeSpecName: "kube-api-access-sqczz") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "kube-api-access-sqczz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.148078 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.161399 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data" (OuterVolumeSpecName: "config-data") pod "74461432-266d-419b-a923-bd4c4d5e0e22" (UID: "74461432-266d-419b-a923-bd4c4d5e0e22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201223 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201252 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqczz\" (UniqueName: \"kubernetes.io/projected/74461432-266d-419b-a923-bd4c4d5e0e22-kube-api-access-sqczz\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201262 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201273 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201281 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74461432-266d-419b-a923-bd4c4d5e0e22-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.201288 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74461432-266d-419b-a923-bd4c4d5e0e22-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.219872 4954 generic.go:334] "Generic (PLEG): container finished" podID="74461432-266d-419b-a923-bd4c4d5e0e22" containerID="9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" exitCode=0 Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.219904 4954 generic.go:334] "Generic (PLEG): container finished" podID="74461432-266d-419b-a923-bd4c4d5e0e22" containerID="bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" exitCode=143 Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.219959 4954 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.220035 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerDied","Data":"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4"} Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.220115 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerDied","Data":"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423"} Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.220133 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74461432-266d-419b-a923-bd4c4d5e0e22","Type":"ContainerDied","Data":"82ba69bcde26b90d13cbcf9e7d3ddbc48851cae2b7509a1956ba02a846d9033a"} Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.220157 4954 scope.go:117] "RemoveContainer" containerID="9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.258828 4954 scope.go:117] "RemoveContainer" containerID="bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.266545 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.277827 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.287754 4954 scope.go:117] "RemoveContainer" containerID="9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" Dec 06 09:00:31 crc kubenswrapper[4954]: E1206 09:00:31.288137 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4\": container with ID starting with 9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4 not found: ID does not exist" containerID="9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.288169 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4"} err="failed to get container status \"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4\": rpc error: code = NotFound desc = could not find container \"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4\": container with ID starting with 9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4 not found: ID does not exist" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.288190 4954 scope.go:117] "RemoveContainer" containerID="bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" Dec 06 09:00:31 crc kubenswrapper[4954]: E1206 09:00:31.288602 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423\": container with ID starting with bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423 not found: ID does not exist" 
containerID="bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.288620 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423"} err="failed to get container status \"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423\": rpc error: code = NotFound desc = could not find container \"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423\": container with ID starting with bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423 not found: ID does not exist" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.288633 4954 scope.go:117] "RemoveContainer" containerID="9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.290125 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4"} err="failed to get container status \"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4\": rpc error: code = NotFound desc = could not find container \"9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4\": container with ID starting with 9c7423daabff7f6c5d33be3a9e09a81f8f081deaa774a2758b39f99a124de1a4 not found: ID does not exist" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.290147 4954 scope.go:117] "RemoveContainer" containerID="bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.290605 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423"} err="failed to get container status \"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423\": rpc error: code = NotFound desc = could not find container \"bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423\": container with ID starting with bbfd2413ff54f75a8c3e9f195eeac0613f5fcad4a8500c5e5654633e31e5f423 not found: ID does not exist" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.294140 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:31 crc kubenswrapper[4954]: E1206 09:00:31.295124 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-log" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.295174 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-log" Dec 06 09:00:31 crc kubenswrapper[4954]: E1206 09:00:31.295263 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-httpd" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.295274 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-httpd" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.295529 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-log" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.295678 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" containerName="glance-httpd" 
Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.296898 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.299894 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.305782 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.327379 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 09:00:31 crc kubenswrapper[4954]: W1206 09:00:31.340438 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbccf6ec3_7db7_497d_97ba_e7002fb77b80.slice/crio-118ad42982ce60c01e606e5170ba5994ec6bbe62f9be498fd4a922b07ecd5417 WatchSource:0}: Error finding container 118ad42982ce60c01e606e5170ba5994ec6bbe62f9be498fd4a922b07ecd5417: Status 404 returned error can't find the container with id 118ad42982ce60c01e606e5170ba5994ec6bbe62f9be498fd4a922b07ecd5417 Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.353431 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.404994 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdcg\" (UniqueName: \"kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405591 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405616 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.405816 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.454480 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59924d65-bfbc-4d6e-a896-0318d18cad56" path="/var/lib/kubelet/pods/59924d65-bfbc-4d6e-a896-0318d18cad56/volumes" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.455342 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74461432-266d-419b-a923-bd4c4d5e0e22" path="/var/lib/kubelet/pods/74461432-266d-419b-a923-bd4c4d5e0e22/volumes" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.507915 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508105 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508168 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdcg\" (UniqueName: \"kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 
crc kubenswrapper[4954]: I1206 09:00:31.508416 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.508942 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.509048 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.512313 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.512315 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.513195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.513711 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.524550 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdcg\" (UniqueName: \"kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg\") pod \"glance-default-internal-api-0\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:00:31 crc kubenswrapper[4954]: I1206 09:00:31.633979 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:32 crc kubenswrapper[4954]: I1206 09:00:32.142469 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:00:32 crc kubenswrapper[4954]: W1206 09:00:32.152158 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5005a42_2172_4a4d_b063_a9c24649e747.slice/crio-6d759c460d85b42748485b75443cc67bdcbc95a2fe4363973ba119ba02a7d17c WatchSource:0}: Error finding container 6d759c460d85b42748485b75443cc67bdcbc95a2fe4363973ba119ba02a7d17c: Status 404 returned error can't find the container with id 6d759c460d85b42748485b75443cc67bdcbc95a2fe4363973ba119ba02a7d17c Dec 06 09:00:32 crc kubenswrapper[4954]: I1206 09:00:32.229086 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerStarted","Data":"6d759c460d85b42748485b75443cc67bdcbc95a2fe4363973ba119ba02a7d17c"} Dec 06 09:00:32 crc kubenswrapper[4954]: I1206 09:00:32.230429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerStarted","Data":"ab537d70b86dbdeee1673a5e85e881972b23f51ef78d8ab71bc18429ee8fe81b"} Dec 06 09:00:32 crc kubenswrapper[4954]: I1206 09:00:32.230455 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerStarted","Data":"118ad42982ce60c01e606e5170ba5994ec6bbe62f9be498fd4a922b07ecd5417"} Dec 06 09:00:33 crc kubenswrapper[4954]: I1206 09:00:33.242907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerStarted","Data":"500f2fc8df3f1ece73cec8ecc49047b0de1241d9f0ee0300d4f8e7cafe8c1b6c"} Dec 06 09:00:33 crc kubenswrapper[4954]: I1206 09:00:33.244443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerStarted","Data":"fd79a5aa15258c69a1e9270ca7c72365c816bc413d9201e4eecfb57701183d44"} Dec 06 09:00:33 crc kubenswrapper[4954]: I1206 09:00:33.275612 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.275580636 podStartE2EDuration="3.275580636s" podCreationTimestamp="2025-12-06 09:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:33.267972104 +0000 UTC m=+7408.081331523" watchObservedRunningTime="2025-12-06 09:00:33.275580636 +0000 UTC m=+7408.088940045" Dec 06 09:00:34 crc kubenswrapper[4954]: I1206 09:00:34.252972 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerStarted","Data":"fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa"} Dec 06 09:00:34 crc kubenswrapper[4954]: I1206 09:00:34.272112 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.272094359 podStartE2EDuration="3.272094359s" podCreationTimestamp="2025-12-06 09:00:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:34.269721426 +0000 UTC m=+7409.083080815" watchObservedRunningTime="2025-12-06 09:00:34.272094359 +0000 UTC m=+7409.085453748" Dec 06 09:00:36 crc kubenswrapper[4954]: I1206 09:00:36.822734 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:00:36 crc kubenswrapper[4954]: I1206 09:00:36.879249 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"] Dec 06 09:00:36 crc kubenswrapper[4954]: I1206 09:00:36.879774 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="dnsmasq-dns" containerID="cri-o://821307b669af19f49db134ae5e408c5a20edf17d92a2eee3b7eeb3634ce89dd6" gracePeriod=10 Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.295287 4954 generic.go:334] "Generic (PLEG): container finished" podID="1aca932f-9701-482e-8973-81dc83779cb1" containerID="821307b669af19f49db134ae5e408c5a20edf17d92a2eee3b7eeb3634ce89dd6" exitCode=0 Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.295363 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" event={"ID":"1aca932f-9701-482e-8973-81dc83779cb1","Type":"ContainerDied","Data":"821307b669af19f49db134ae5e408c5a20edf17d92a2eee3b7eeb3634ce89dd6"} Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.295656 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" event={"ID":"1aca932f-9701-482e-8973-81dc83779cb1","Type":"ContainerDied","Data":"6dab85d51c1aa1fb222fb8d119da5a5dbc374a883d1a1b00705456df34e9c235"} Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.295688 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dab85d51c1aa1fb222fb8d119da5a5dbc374a883d1a1b00705456df34e9c235" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.359526 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.538993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx8bm\" (UniqueName: \"kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm\") pod \"1aca932f-9701-482e-8973-81dc83779cb1\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.539049 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb\") pod \"1aca932f-9701-482e-8973-81dc83779cb1\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.539090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc\") pod \"1aca932f-9701-482e-8973-81dc83779cb1\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.539200 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config\") pod \"1aca932f-9701-482e-8973-81dc83779cb1\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.539266 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb\") pod \"1aca932f-9701-482e-8973-81dc83779cb1\" (UID: \"1aca932f-9701-482e-8973-81dc83779cb1\") " Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.545203 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm" (OuterVolumeSpecName: "kube-api-access-mx8bm") pod "1aca932f-9701-482e-8973-81dc83779cb1" (UID: "1aca932f-9701-482e-8973-81dc83779cb1"). InnerVolumeSpecName "kube-api-access-mx8bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.585222 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1aca932f-9701-482e-8973-81dc83779cb1" (UID: "1aca932f-9701-482e-8973-81dc83779cb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.585717 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1aca932f-9701-482e-8973-81dc83779cb1" (UID: "1aca932f-9701-482e-8973-81dc83779cb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.587137 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config" (OuterVolumeSpecName: "config") pod "1aca932f-9701-482e-8973-81dc83779cb1" (UID: "1aca932f-9701-482e-8973-81dc83779cb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.589360 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1aca932f-9701-482e-8973-81dc83779cb1" (UID: "1aca932f-9701-482e-8973-81dc83779cb1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.641381 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.641419 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx8bm\" (UniqueName: \"kubernetes.io/projected/1aca932f-9701-482e-8973-81dc83779cb1-kube-api-access-mx8bm\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.641433 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.641444 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:37 crc kubenswrapper[4954]: I1206 09:00:37.641458 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aca932f-9701-482e-8973-81dc83779cb1-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:38 crc kubenswrapper[4954]: I1206 09:00:38.303250 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf6448dcf-qplt7" Dec 06 09:00:38 crc kubenswrapper[4954]: I1206 09:00:38.334373 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"] Dec 06 09:00:38 crc kubenswrapper[4954]: I1206 09:00:38.343623 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cf6448dcf-qplt7"] Dec 06 09:00:39 crc kubenswrapper[4954]: I1206 09:00:39.456201 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aca932f-9701-482e-8973-81dc83779cb1" path="/var/lib/kubelet/pods/1aca932f-9701-482e-8973-81dc83779cb1/volumes" Dec 06 09:00:40 crc kubenswrapper[4954]: I1206 09:00:40.656946 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 09:00:40 crc kubenswrapper[4954]: I1206 09:00:40.657002 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 09:00:40 crc kubenswrapper[4954]: I1206 09:00:40.689787 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 09:00:40 crc kubenswrapper[4954]: I1206 09:00:40.714285 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.329515 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.329570 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.634804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.634906 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.671925 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:41 crc kubenswrapper[4954]: I1206 09:00:41.700421 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:42 crc kubenswrapper[4954]: I1206 09:00:42.338271 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:42 crc kubenswrapper[4954]: I1206 09:00:42.338641 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:43 crc kubenswrapper[4954]: I1206 09:00:43.753688 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 09:00:43 crc kubenswrapper[4954]: I1206 09:00:43.754966 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 09:00:45 crc kubenswrapper[4954]: I1206 09:00:45.147500 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:45 crc kubenswrapper[4954]: I1206 09:00:45.148707 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:00:45 crc 
kubenswrapper[4954]: I1206 09:00:45.200442 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.370005 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-h75rt"] Dec 06 09:00:53 crc kubenswrapper[4954]: E1206 09:00:53.370963 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="init" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.370982 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="init" Dec 06 09:00:53 crc kubenswrapper[4954]: E1206 09:00:53.371006 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="dnsmasq-dns" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.371015 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="dnsmasq-dns" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.371274 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aca932f-9701-482e-8973-81dc83779cb1" containerName="dnsmasq-dns" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.372085 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.384701 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h75rt"] Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.463538 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqxv\" (UniqueName: \"kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.463621 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.474815 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-856b-account-create-update-tftj9"] Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.478440 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.483949 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.493182 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-856b-account-create-update-tftj9"] Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.565545 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gsl\" (UniqueName: \"kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.565692 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqxv\" (UniqueName: \"kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.565780 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.565820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.566723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.600297 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqxv\" (UniqueName: \"kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv\") pod \"placement-db-create-h75rt\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.668126 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.668330 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gsl\" (UniqueName: 
\"kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.669181 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.688779 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gsl\" (UniqueName: \"kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl\") pod \"placement-856b-account-create-update-tftj9\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.701173 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h75rt" Dec 06 09:00:53 crc kubenswrapper[4954]: I1206 09:00:53.798019 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.185125 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h75rt"] Dec 06 09:00:54 crc kubenswrapper[4954]: W1206 09:00:54.190807 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6670e88_e4e3_47e7_988c_55a8d6fc868f.slice/crio-fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f WatchSource:0}: Error finding container fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f: Status 404 returned error can't find the container with id fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.303542 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-856b-account-create-update-tftj9"] Dec 06 09:00:54 crc kubenswrapper[4954]: W1206 09:00:54.307771 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda375ec95_2316_4971_9b5e_757088bfaa3a.slice/crio-78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad WatchSource:0}: Error finding container 78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad: Status 404 returned error can't find the container with id 78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.455296 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h75rt" event={"ID":"a6670e88-e4e3-47e7-988c-55a8d6fc868f","Type":"ContainerStarted","Data":"51ba5f6bebaee456a005a086ab9cbccc59287c82b90049fce52c6ac82e1708c5"} Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.455340 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h75rt" event={"ID":"a6670e88-e4e3-47e7-988c-55a8d6fc868f","Type":"ContainerStarted","Data":"fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f"} Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 
09:00:54.458119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-856b-account-create-update-tftj9" event={"ID":"a375ec95-2316-4971-9b5e-757088bfaa3a","Type":"ContainerStarted","Data":"baebdeff029d20859471674ca23a4036db0ec8cd584b13a62b8e1923fef26993"} Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.458154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-856b-account-create-update-tftj9" event={"ID":"a375ec95-2316-4971-9b5e-757088bfaa3a","Type":"ContainerStarted","Data":"78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad"} Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.473184 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-h75rt" podStartSLOduration=1.4731631840000001 podStartE2EDuration="1.473163184s" podCreationTimestamp="2025-12-06 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:54.473089502 +0000 UTC m=+7429.286448881" watchObservedRunningTime="2025-12-06 09:00:54.473163184 +0000 UTC m=+7429.286522573" Dec 06 09:00:54 crc kubenswrapper[4954]: I1206 09:00:54.487694 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-856b-account-create-update-tftj9" podStartSLOduration=1.48767513 podStartE2EDuration="1.48767513s" podCreationTimestamp="2025-12-06 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:00:54.485078311 +0000 UTC m=+7429.298437700" watchObservedRunningTime="2025-12-06 09:00:54.48767513 +0000 UTC m=+7429.301034519" Dec 06 09:00:55 crc kubenswrapper[4954]: I1206 09:00:55.472898 4954 generic.go:334] "Generic (PLEG): container finished" podID="a375ec95-2316-4971-9b5e-757088bfaa3a" containerID="baebdeff029d20859471674ca23a4036db0ec8cd584b13a62b8e1923fef26993" exitCode=0 Dec 06 09:00:55 crc kubenswrapper[4954]: I1206 09:00:55.472984 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-856b-account-create-update-tftj9" event={"ID":"a375ec95-2316-4971-9b5e-757088bfaa3a","Type":"ContainerDied","Data":"baebdeff029d20859471674ca23a4036db0ec8cd584b13a62b8e1923fef26993"} Dec 06 09:00:55 crc kubenswrapper[4954]: I1206 09:00:55.475250 4954 generic.go:334] "Generic (PLEG): container finished" podID="a6670e88-e4e3-47e7-988c-55a8d6fc868f" containerID="51ba5f6bebaee456a005a086ab9cbccc59287c82b90049fce52c6ac82e1708c5" exitCode=0 Dec 06 09:00:55 crc kubenswrapper[4954]: I1206 09:00:55.475285 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h75rt" event={"ID":"a6670e88-e4e3-47e7-988c-55a8d6fc868f","Type":"ContainerDied","Data":"51ba5f6bebaee456a005a086ab9cbccc59287c82b90049fce52c6ac82e1708c5"} Dec 06 09:00:56 crc kubenswrapper[4954]: I1206 09:00:56.914591 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h75rt" Dec 06 09:00:56 crc kubenswrapper[4954]: I1206 09:00:56.920448 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.032763 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts\") pod \"a375ec95-2316-4971-9b5e-757088bfaa3a\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.032811 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9gsl\" (UniqueName: \"kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl\") pod \"a375ec95-2316-4971-9b5e-757088bfaa3a\" (UID: \"a375ec95-2316-4971-9b5e-757088bfaa3a\") " Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.033356 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a375ec95-2316-4971-9b5e-757088bfaa3a" (UID: "a375ec95-2316-4971-9b5e-757088bfaa3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.033869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts\") pod \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.033968 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqxv\" (UniqueName: \"kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv\") pod \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\" (UID: \"a6670e88-e4e3-47e7-988c-55a8d6fc868f\") " Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.034298 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6670e88-e4e3-47e7-988c-55a8d6fc868f" (UID: "a6670e88-e4e3-47e7-988c-55a8d6fc868f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.034819 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6670e88-e4e3-47e7-988c-55a8d6fc868f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.034838 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a375ec95-2316-4971-9b5e-757088bfaa3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.038217 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv" (OuterVolumeSpecName: "kube-api-access-vzqxv") pod "a6670e88-e4e3-47e7-988c-55a8d6fc868f" (UID: "a6670e88-e4e3-47e7-988c-55a8d6fc868f"). InnerVolumeSpecName "kube-api-access-vzqxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.038296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl" (OuterVolumeSpecName: "kube-api-access-d9gsl") pod "a375ec95-2316-4971-9b5e-757088bfaa3a" (UID: "a375ec95-2316-4971-9b5e-757088bfaa3a"). InnerVolumeSpecName "kube-api-access-d9gsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.136761 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqxv\" (UniqueName: \"kubernetes.io/projected/a6670e88-e4e3-47e7-988c-55a8d6fc868f-kube-api-access-vzqxv\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.136800 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9gsl\" (UniqueName: \"kubernetes.io/projected/a375ec95-2316-4971-9b5e-757088bfaa3a-kube-api-access-d9gsl\") on node \"crc\" DevicePath \"\"" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.504874 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h75rt" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.504910 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h75rt" event={"ID":"a6670e88-e4e3-47e7-988c-55a8d6fc868f","Type":"ContainerDied","Data":"fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f"} Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.505310 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc83675731519e687c84741b766f7cdd15f4c1c53a7418fbc253728275d618f" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.507142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-856b-account-create-update-tftj9" event={"ID":"a375ec95-2316-4971-9b5e-757088bfaa3a","Type":"ContainerDied","Data":"78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad"} Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.507194 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78de509536f7431d1b0ca31ece2715e6f5dfc5c709411b2c6d2e4c5da6a2c2ad" Dec 06 09:00:57 crc kubenswrapper[4954]: I1206 09:00:57.507217 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-856b-account-create-update-tftj9" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.797738 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xfwfr"] Dec 06 09:00:58 crc kubenswrapper[4954]: E1206 09:00:58.798129 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a375ec95-2316-4971-9b5e-757088bfaa3a" containerName="mariadb-account-create-update" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.798142 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a375ec95-2316-4971-9b5e-757088bfaa3a" containerName="mariadb-account-create-update" Dec 06 09:00:58 crc kubenswrapper[4954]: E1206 09:00:58.798160 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6670e88-e4e3-47e7-988c-55a8d6fc868f" containerName="mariadb-database-create" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.798166 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6670e88-e4e3-47e7-988c-55a8d6fc868f" containerName="mariadb-database-create" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.798368 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6670e88-e4e3-47e7-988c-55a8d6fc868f" containerName="mariadb-database-create" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.798379 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a375ec95-2316-4971-9b5e-757088bfaa3a" containerName="mariadb-account-create-update" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.799251 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.801701 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.804235 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-26sbh" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.804255 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.809212 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.810867 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.834128 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xfwfr"] Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.857680 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.896903 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.896957 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.896988 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g2k4\" (UniqueName: \"kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfqg2\" (UniqueName: \"kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897032 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897127 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897150 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " 
pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897187 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.897231 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999205 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999588 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999675 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999742 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g2k4\" (UniqueName: \"kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999773 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfqg2\" (UniqueName: \"kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999789 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 
09:00:58.999849 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:58 crc kubenswrapper[4954]: I1206 09:00:58.999951 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.001394 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.001437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.001495 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.001637 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.001764 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.006092 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.006158 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle\") pod \"placement-db-sync-xfwfr\" (UID: 
\"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.017268 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.020103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g2k4\" (UniqueName: \"kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4\") pod \"placement-db-sync-xfwfr\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.020783 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfqg2\" (UniqueName: \"kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2\") pod \"dnsmasq-dns-b9f8d8cdf-5rtxk\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.122192 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfwfr" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.145333 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.666521 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xfwfr"] Dec 06 09:00:59 crc kubenswrapper[4954]: I1206 09:00:59.794414 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:00:59 crc kubenswrapper[4954]: W1206 09:00:59.797840 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11ea7382_8d96_4609_a38c_b7fd6ef3a0a2.slice/crio-1f7d3518d008fffc7272ccb2ccac25d3db49cc8efac9da6b58966cf4191ea356 WatchSource:0}: Error finding container 1f7d3518d008fffc7272ccb2ccac25d3db49cc8efac9da6b58966cf4191ea356: Status 404 returned error can't find the container with id 1f7d3518d008fffc7272ccb2ccac25d3db49cc8efac9da6b58966cf4191ea356 Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.139110 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416861-2xng2"] Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.140658 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.173006 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416861-2xng2"] Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.254361 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.254416 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.254549 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.254652 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72hc\" (UniqueName: \"kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.356245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.356334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72hc\" (UniqueName: \"kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.356444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.356470 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.360274 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.361008 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.361081 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.379449 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72hc\" (UniqueName: \"kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc\") pod \"keystone-cron-29416861-2xng2\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.467971 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.551058 4954 generic.go:334] "Generic (PLEG): container finished" podID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerID="c35638e43c2562e6dfb9cf83a637ba84197fc11b549b9b4800be864aa7e6ce79" exitCode=0 Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.551484 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" event={"ID":"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2","Type":"ContainerDied","Data":"c35638e43c2562e6dfb9cf83a637ba84197fc11b549b9b4800be864aa7e6ce79"} Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.551531 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" event={"ID":"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2","Type":"ContainerStarted","Data":"1f7d3518d008fffc7272ccb2ccac25d3db49cc8efac9da6b58966cf4191ea356"} Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.559405 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfwfr" event={"ID":"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77","Type":"ContainerStarted","Data":"019acc1361764b7e444e60299a05253fda24fef89f06244851c54dacbb63dd74"} Dec 06 09:01:00 crc kubenswrapper[4954]: I1206 09:01:00.945282 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416861-2xng2"] Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.582834 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416861-2xng2" event={"ID":"36e918d7-8a47-464a-be52-aa9a7f8a875a","Type":"ContainerStarted","Data":"ecf9c4568a44ce2b630934a98e52a22e76539023c79cd5c152d30cb913236dfa"} Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.583134 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416861-2xng2" 
event={"ID":"36e918d7-8a47-464a-be52-aa9a7f8a875a","Type":"ContainerStarted","Data":"1b64815f4661ee9c721a2d5c6e8de90ddec645bb5d29e78b0848269b484006d9"} Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.585047 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" event={"ID":"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2","Type":"ContainerStarted","Data":"c850f31740b9bc831a3e309e59b3fb0fb870f632cac3b269ff39b146154c87f7"} Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.585506 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.607398 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416861-2xng2" podStartSLOduration=1.607377011 podStartE2EDuration="1.607377011s" podCreationTimestamp="2025-12-06 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:01:01.600814466 +0000 UTC m=+7436.414173866" watchObservedRunningTime="2025-12-06 09:01:01.607377011 +0000 UTC m=+7436.420736430" Dec 06 09:01:01 crc kubenswrapper[4954]: I1206 09:01:01.625534 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" podStartSLOduration=3.625511594 podStartE2EDuration="3.625511594s" podCreationTimestamp="2025-12-06 09:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:01:01.619408741 +0000 UTC m=+7436.432768130" watchObservedRunningTime="2025-12-06 09:01:01.625511594 +0000 UTC m=+7436.438870983" Dec 06 09:01:03 crc kubenswrapper[4954]: I1206 09:01:03.606202 4954 generic.go:334] "Generic (PLEG): container finished" podID="36e918d7-8a47-464a-be52-aa9a7f8a875a" containerID="ecf9c4568a44ce2b630934a98e52a22e76539023c79cd5c152d30cb913236dfa" exitCode=0 Dec 06 09:01:03 crc kubenswrapper[4954]: I1206 09:01:03.606295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416861-2xng2" event={"ID":"36e918d7-8a47-464a-be52-aa9a7f8a875a","Type":"ContainerDied","Data":"ecf9c4568a44ce2b630934a98e52a22e76539023c79cd5c152d30cb913236dfa"} Dec 06 09:01:04 crc kubenswrapper[4954]: I1206 09:01:04.616649 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfwfr" event={"ID":"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77","Type":"ContainerStarted","Data":"a1be0a810e0b1182c5cdee9415d159dfb3145088f0e197f52835e596f783ae2f"} Dec 06 09:01:04 crc kubenswrapper[4954]: I1206 09:01:04.635240 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xfwfr" podStartSLOduration=2.97838417 podStartE2EDuration="6.635220978s" podCreationTimestamp="2025-12-06 09:00:58 +0000 UTC" firstStartedPulling="2025-12-06 09:00:59.676403718 +0000 UTC m=+7434.489763117" lastFinishedPulling="2025-12-06 09:01:03.333240536 +0000 UTC m=+7438.146599925" observedRunningTime="2025-12-06 09:01:04.633930463 +0000 UTC m=+7439.447289852" watchObservedRunningTime="2025-12-06 09:01:04.635220978 +0000 UTC m=+7439.448580367" Dec 06 09:01:04 crc kubenswrapper[4954]: I1206 09:01:04.951449 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.051819 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s72hc\" (UniqueName: \"kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc\") pod \"36e918d7-8a47-464a-be52-aa9a7f8a875a\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.051878 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data\") pod \"36e918d7-8a47-464a-be52-aa9a7f8a875a\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.051934 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle\") pod \"36e918d7-8a47-464a-be52-aa9a7f8a875a\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.052085 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys\") pod \"36e918d7-8a47-464a-be52-aa9a7f8a875a\" (UID: \"36e918d7-8a47-464a-be52-aa9a7f8a875a\") " Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.058216 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "36e918d7-8a47-464a-be52-aa9a7f8a875a" (UID: "36e918d7-8a47-464a-be52-aa9a7f8a875a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.058712 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc" (OuterVolumeSpecName: "kube-api-access-s72hc") pod "36e918d7-8a47-464a-be52-aa9a7f8a875a" (UID: "36e918d7-8a47-464a-be52-aa9a7f8a875a"). InnerVolumeSpecName "kube-api-access-s72hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.087057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36e918d7-8a47-464a-be52-aa9a7f8a875a" (UID: "36e918d7-8a47-464a-be52-aa9a7f8a875a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.103160 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data" (OuterVolumeSpecName: "config-data") pod "36e918d7-8a47-464a-be52-aa9a7f8a875a" (UID: "36e918d7-8a47-464a-be52-aa9a7f8a875a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.153896 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.153929 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.153940 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s72hc\" (UniqueName: \"kubernetes.io/projected/36e918d7-8a47-464a-be52-aa9a7f8a875a-kube-api-access-s72hc\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.153949 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36e918d7-8a47-464a-be52-aa9a7f8a875a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.629293 4954 generic.go:334] "Generic (PLEG): container finished" podID="9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" containerID="a1be0a810e0b1182c5cdee9415d159dfb3145088f0e197f52835e596f783ae2f" exitCode=0 Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.629383 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfwfr" event={"ID":"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77","Type":"ContainerDied","Data":"a1be0a810e0b1182c5cdee9415d159dfb3145088f0e197f52835e596f783ae2f"} Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.635864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416861-2xng2" event={"ID":"36e918d7-8a47-464a-be52-aa9a7f8a875a","Type":"ContainerDied","Data":"1b64815f4661ee9c721a2d5c6e8de90ddec645bb5d29e78b0848269b484006d9"} Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.635928 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b64815f4661ee9c721a2d5c6e8de90ddec645bb5d29e78b0848269b484006d9" Dec 06 09:01:05 crc kubenswrapper[4954]: I1206 09:01:05.636006 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416861-2xng2" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.055895 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xfwfr" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.121421 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle\") pod \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.121480 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts\") pod \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.121614 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data\") pod \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.121656 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g2k4\" (UniqueName: \"kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4\") pod \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.121727 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs\") pod \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\" (UID: \"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77\") " Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.122273 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs" (OuterVolumeSpecName: "logs") pod "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" (UID: "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.126832 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts" (OuterVolumeSpecName: "scripts") pod "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" (UID: "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.127975 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4" (OuterVolumeSpecName: "kube-api-access-6g2k4") pod "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" (UID: "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77"). InnerVolumeSpecName "kube-api-access-6g2k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.154988 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" (UID: "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.156515 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data" (OuterVolumeSpecName: "config-data") pod "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" (UID: "9d9c4efc-2261-4ee6-8de1-ff2ee393fa77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.224445 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.224484 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g2k4\" (UniqueName: \"kubernetes.io/projected/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-kube-api-access-6g2k4\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.224495 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.224504 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.224512 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.660671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfwfr" event={"ID":"9d9c4efc-2261-4ee6-8de1-ff2ee393fa77","Type":"ContainerDied","Data":"019acc1361764b7e444e60299a05253fda24fef89f06244851c54dacbb63dd74"} Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.661012 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="019acc1361764b7e444e60299a05253fda24fef89f06244851c54dacbb63dd74" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.660770 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xfwfr" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.740677 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75f95f96c6-h66hl"] Dec 06 09:01:07 crc kubenswrapper[4954]: E1206 09:01:07.741038 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e918d7-8a47-464a-be52-aa9a7f8a875a" containerName="keystone-cron" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.741057 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e918d7-8a47-464a-be52-aa9a7f8a875a" containerName="keystone-cron" Dec 06 09:01:07 crc kubenswrapper[4954]: E1206 09:01:07.741089 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" containerName="placement-db-sync" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.741095 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" containerName="placement-db-sync" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.741274 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="36e918d7-8a47-464a-be52-aa9a7f8a875a" containerName="keystone-cron" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.741307 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" containerName="placement-db-sync" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.742264 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.745717 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.745937 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.746020 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-26sbh" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.747614 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.748165 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.770064 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75f95f96c6-h66hl"] Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.833786 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-combined-ca-bundle\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.833890 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-public-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.833926 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-internal-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.834001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-scripts\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.834110 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-config-data\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.834205 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e149340-c1cd-4bc4-acc8-396bc4c314f7-logs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.834261 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxw4\" (UniqueName: \"kubernetes.io/projected/4e149340-c1cd-4bc4-acc8-396bc4c314f7-kube-api-access-2fxw4\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936103 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-scripts\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936177 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-config-data\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936204 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e149340-c1cd-4bc4-acc8-396bc4c314f7-logs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxw4\" (UniqueName: \"kubernetes.io/projected/4e149340-c1cd-4bc4-acc8-396bc4c314f7-kube-api-access-2fxw4\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936258 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-combined-ca-bundle\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-public-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.936371 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-internal-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.937135 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e149340-c1cd-4bc4-acc8-396bc4c314f7-logs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.941667 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-config-data\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.941816 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-internal-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.944722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-combined-ca-bundle\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.945064 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-scripts\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.948169 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e149340-c1cd-4bc4-acc8-396bc4c314f7-public-tls-certs\") pod \"placement-75f95f96c6-h66hl\" (UID: \"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:07 crc kubenswrapper[4954]: I1206 09:01:07.956426 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxw4\" (UniqueName: \"kubernetes.io/projected/4e149340-c1cd-4bc4-acc8-396bc4c314f7-kube-api-access-2fxw4\") pod \"placement-75f95f96c6-h66hl\" (UID: 
\"4e149340-c1cd-4bc4-acc8-396bc4c314f7\") " pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:08 crc kubenswrapper[4954]: I1206 09:01:08.066010 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:08 crc kubenswrapper[4954]: I1206 09:01:08.320716 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75f95f96c6-h66hl"] Dec 06 09:01:08 crc kubenswrapper[4954]: W1206 09:01:08.325928 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e149340_c1cd_4bc4_acc8_396bc4c314f7.slice/crio-2867819a5b666c7c816e336a4a1f1dac5f52d922928e1a647293fb74973cf910 WatchSource:0}: Error finding container 2867819a5b666c7c816e336a4a1f1dac5f52d922928e1a647293fb74973cf910: Status 404 returned error can't find the container with id 2867819a5b666c7c816e336a4a1f1dac5f52d922928e1a647293fb74973cf910 Dec 06 09:01:08 crc kubenswrapper[4954]: I1206 09:01:08.670919 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f95f96c6-h66hl" event={"ID":"4e149340-c1cd-4bc4-acc8-396bc4c314f7","Type":"ContainerStarted","Data":"bc4da49a5fa150b555b62b85b041278536550f5f9cd05e1fd0ee3289be2762cf"} Dec 06 09:01:08 crc kubenswrapper[4954]: I1206 09:01:08.671221 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f95f96c6-h66hl" event={"ID":"4e149340-c1cd-4bc4-acc8-396bc4c314f7","Type":"ContainerStarted","Data":"2867819a5b666c7c816e336a4a1f1dac5f52d922928e1a647293fb74973cf910"} Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.146768 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.199334 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.202260 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="dnsmasq-dns" containerID="cri-o://d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c" gracePeriod=10 Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.681036 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f95f96c6-h66hl" event={"ID":"4e149340-c1cd-4bc4-acc8-396bc4c314f7","Type":"ContainerStarted","Data":"ad0511efeeeb7ff9c2cfebf22ef6f7ca1f6700b98219519e5e1194c0bd658ad6"} Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.681742 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.681767 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.683633 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.684086 4954 generic.go:334] "Generic (PLEG): container finished" podID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerID="d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c" exitCode=0 Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.684131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" event={"ID":"1122f84e-bd24-4177-9ce1-00a28c07cef1","Type":"ContainerDied","Data":"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c"} Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.684160 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" event={"ID":"1122f84e-bd24-4177-9ce1-00a28c07cef1","Type":"ContainerDied","Data":"9cdbacd5bb094692882df9a1605208d23159148341eadabc872616b10d421b14"} Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.684180 4954 scope.go:117] "RemoveContainer" containerID="d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.703537 4954 scope.go:117] "RemoveContainer" containerID="af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.708024 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75f95f96c6-h66hl" podStartSLOduration=2.707999091 podStartE2EDuration="2.707999091s" podCreationTimestamp="2025-12-06 09:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:01:09.699838953 +0000 UTC m=+7444.513198342" watchObservedRunningTime="2025-12-06 09:01:09.707999091 +0000 UTC m=+7444.521358480" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.732762 4954 scope.go:117] "RemoveContainer" containerID="d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c" Dec 06 09:01:09 crc kubenswrapper[4954]: E1206 09:01:09.733876 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c\": container with ID starting with d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c not found: ID does not exist" containerID="d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.733932 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c"} err="failed to get container status \"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c\": rpc error: code = NotFound desc = could not find container \"d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c\": container with ID starting with d320fd3f1a507f3877ea11d94b91e6c0a21a27e0c97b2454b07dd02fbaacb04c not found: ID does not exist" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.733963 4954 scope.go:117] "RemoveContainer" containerID="af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e" Dec 06 09:01:09 crc kubenswrapper[4954]: E1206 09:01:09.736826 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e\": 
container with ID starting with af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e not found: ID does not exist" containerID="af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.736874 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e"} err="failed to get container status \"af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e\": rpc error: code = NotFound desc = could not find container \"af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e\": container with ID starting with af709883c925e62966e00a8b68e98a577a0f1662b1367ecbad33cfd26477b46e not found: ID does not exist" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.778104 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8ld2\" (UniqueName: \"kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2\") pod \"1122f84e-bd24-4177-9ce1-00a28c07cef1\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.778152 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config\") pod \"1122f84e-bd24-4177-9ce1-00a28c07cef1\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.778214 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc\") pod \"1122f84e-bd24-4177-9ce1-00a28c07cef1\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.778245 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb\") pod \"1122f84e-bd24-4177-9ce1-00a28c07cef1\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.778401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb\") pod \"1122f84e-bd24-4177-9ce1-00a28c07cef1\" (UID: \"1122f84e-bd24-4177-9ce1-00a28c07cef1\") " Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.784107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2" (OuterVolumeSpecName: "kube-api-access-j8ld2") pod "1122f84e-bd24-4177-9ce1-00a28c07cef1" (UID: "1122f84e-bd24-4177-9ce1-00a28c07cef1"). InnerVolumeSpecName "kube-api-access-j8ld2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.829445 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1122f84e-bd24-4177-9ce1-00a28c07cef1" (UID: "1122f84e-bd24-4177-9ce1-00a28c07cef1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.829654 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config" (OuterVolumeSpecName: "config") pod "1122f84e-bd24-4177-9ce1-00a28c07cef1" (UID: "1122f84e-bd24-4177-9ce1-00a28c07cef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.830022 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1122f84e-bd24-4177-9ce1-00a28c07cef1" (UID: "1122f84e-bd24-4177-9ce1-00a28c07cef1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.843182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1122f84e-bd24-4177-9ce1-00a28c07cef1" (UID: "1122f84e-bd24-4177-9ce1-00a28c07cef1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.881216 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.881267 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.881281 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.881298 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8ld2\" (UniqueName: \"kubernetes.io/projected/1122f84e-bd24-4177-9ce1-00a28c07cef1-kube-api-access-j8ld2\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:09 crc kubenswrapper[4954]: I1206 09:01:09.881310 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1122f84e-bd24-4177-9ce1-00a28c07cef1-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:01:10 crc kubenswrapper[4954]: I1206 09:01:10.101605 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:01:10 crc kubenswrapper[4954]: I1206 09:01:10.101678 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:01:10 crc kubenswrapper[4954]: I1206 09:01:10.692418 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6895dc775f-db2wf" Dec 06 09:01:10 crc kubenswrapper[4954]: I1206 09:01:10.725322 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:01:10 crc kubenswrapper[4954]: I1206 09:01:10.740808 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6895dc775f-db2wf"] Dec 06 09:01:11 crc kubenswrapper[4954]: I1206 09:01:11.455249 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" path="/var/lib/kubelet/pods/1122f84e-bd24-4177-9ce1-00a28c07cef1/volumes" Dec 06 09:01:23 crc kubenswrapper[4954]: I1206 09:01:23.600348 4954 scope.go:117] "RemoveContainer" containerID="80cee4ea69672d7aec6aa3bcd0b8881418a5d92e0ad74501f1ded895bf634a8c" Dec 06 09:01:23 crc kubenswrapper[4954]: I1206 09:01:23.633362 4954 scope.go:117] "RemoveContainer" containerID="903979689bf5a6b261e67f3a7d1d7c626c2b9bdd7a6c09d161614dd9163913b9" Dec 06 09:01:39 crc kubenswrapper[4954]: I1206 09:01:39.093728 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:39 crc kubenswrapper[4954]: I1206 09:01:39.096949 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75f95f96c6-h66hl" Dec 06 09:01:40 crc kubenswrapper[4954]: I1206 09:01:40.101458 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:01:40 crc kubenswrapper[4954]: I1206 09:01:40.101543 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.982959 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2srl5"] Dec 06 09:02:02 crc kubenswrapper[4954]: E1206 09:02:02.984222 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="dnsmasq-dns" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.984239 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="dnsmasq-dns" Dec 06 09:02:02 crc kubenswrapper[4954]: E1206 09:02:02.984272 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="init" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.984281 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="init" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.984533 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1122f84e-bd24-4177-9ce1-00a28c07cef1" containerName="dnsmasq-dns" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.985420 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2srl5" Dec 06 09:02:02 crc kubenswrapper[4954]: I1206 09:02:02.993113 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2srl5"] Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.046450 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-x24tp"] Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.047961 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x24tp" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.054511 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f156-account-create-update-vxs99"] Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.055663 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f156-account-create-update-vxs99" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.059001 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.089114 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x24tp"] Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.106660 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f156-account-create-update-vxs99"] Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183018 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhxs\" (UniqueName: \"kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183324 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183480 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj2r7\" (UniqueName: \"kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5" Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp" 
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.183780 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p4wh\" (UniqueName: \"kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.256377 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-c5cvb"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.257540 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.278537 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-05ea-account-create-update-lrhlm"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.279873 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.282302 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284619 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj2r7\" (UniqueName: \"kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284711 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284740 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284762 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4wh\" (UniqueName: \"kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.284785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhxs\" (UniqueName: \"kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.285662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.285667 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.286056 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.287746 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c5cvb"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.302774 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05ea-account-create-update-lrhlm"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.302876 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj2r7\" (UniqueName: \"kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7\") pod \"nova-api-f156-account-create-update-vxs99\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") " pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.303206 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4wh\" (UniqueName: \"kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh\") pod \"nova-cell0-db-create-x24tp\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") " pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.303489 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhxs\" (UniqueName: \"kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs\") pod \"nova-api-db-create-2srl5\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") " pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.335969 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.369459 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.383726 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.386529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.386599 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhcx\" (UniqueName: \"kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.386676 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.386749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5kl\" (UniqueName: \"kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.468461 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4263-account-create-update-6l4p9"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.469470 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.473121 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.479105 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4263-account-create-update-6l4p9"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.491795 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.491913 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.491943 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhcx\" (UniqueName: \"kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.491995 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4x5\" (UniqueName: \"kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.492052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.492145 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5kl\" (UniqueName: \"kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.495401 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.497540 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.514780 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhcx\" (UniqueName: \"kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx\") pod \"nova-cell1-db-create-c5cvb\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") " pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.518055 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5kl\" (UniqueName: \"kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl\") pod \"nova-cell0-05ea-account-create-update-lrhlm\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") " pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.574273 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.594191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4x5\" (UniqueName: \"kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.594331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.595139 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.610217 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4x5\" (UniqueName: \"kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5\") pod \"nova-cell1-4263-account-create-update-6l4p9\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") " pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.614119 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.847220 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.877030 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2srl5"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.977926 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x24tp"]
Dec 06 09:02:03 crc kubenswrapper[4954]: I1206 09:02:03.986117 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f156-account-create-update-vxs99"]
Dec 06 09:02:03 crc kubenswrapper[4954]: W1206 09:02:03.989356 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0283d717_473b_41df_a35e_dc69e47c185f.slice/crio-6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9 WatchSource:0}: Error finding container 6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9: Status 404 returned error can't find the container with id 6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.136686 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c5cvb"]
Dec 06 09:02:04 crc kubenswrapper[4954]: W1206 09:02:04.144140 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod100c0102_825d_4f1c_9a32_511f7ef4081c.slice/crio-b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7 WatchSource:0}: Error finding container b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7: Status 404 returned error can't find the container with id b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.231073 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05ea-account-create-update-lrhlm"]
Dec 06 09:02:04 crc kubenswrapper[4954]: W1206 09:02:04.237148 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17477a55_543f_4561_bbae_101a1fba05d9.slice/crio-4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641 WatchSource:0}: Error finding container 4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641: Status 404 returned error can't find the container with id 4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.350810 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4263-account-create-update-6l4p9"]
Dec 06 09:02:04 crc kubenswrapper[4954]: W1206 09:02:04.362000 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9836498d_4b30_4325_9cdc_0cfe5c3073fc.slice/crio-e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637 WatchSource:0}: Error finding container e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637: Status 404 returned error can't find the container with id e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.564369 4954 generic.go:334] "Generic (PLEG): container finished" podID="0283d717-473b-41df-a35e-dc69e47c185f" containerID="827d60d6e470d3fdbf0b8c088055c0c89edc93d7017da2deb4dd6ad0da54697b" exitCode=0
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.564413 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x24tp" event={"ID":"0283d717-473b-41df-a35e-dc69e47c185f","Type":"ContainerDied","Data":"827d60d6e470d3fdbf0b8c088055c0c89edc93d7017da2deb4dd6ad0da54697b"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.564470 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x24tp" event={"ID":"0283d717-473b-41df-a35e-dc69e47c185f","Type":"ContainerStarted","Data":"6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.571071 4954 generic.go:334] "Generic (PLEG): container finished" podID="91b65e94-c4ec-45b8-b20f-9369a89280fd" containerID="6c5f824e827abaa732aed26fc27f9076dacdab6c2af1a424276b3032c33a485d" exitCode=0
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.571183 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2srl5" event={"ID":"91b65e94-c4ec-45b8-b20f-9369a89280fd","Type":"ContainerDied","Data":"6c5f824e827abaa732aed26fc27f9076dacdab6c2af1a424276b3032c33a485d"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.571210 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2srl5" event={"ID":"91b65e94-c4ec-45b8-b20f-9369a89280fd","Type":"ContainerStarted","Data":"4e181ec24be1d2cbb46ff1689a9c5b535928de016dd4d96e1648829fa5281285"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.572474 4954 generic.go:334] "Generic (PLEG): container finished" podID="100c0102-825d-4f1c-9a32-511f7ef4081c" containerID="db03219069090717bb2cb3d4922c829147b65a42f4dab9649baea8a1b8503d1f" exitCode=0
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.572539 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c5cvb" event={"ID":"100c0102-825d-4f1c-9a32-511f7ef4081c","Type":"ContainerDied","Data":"db03219069090717bb2cb3d4922c829147b65a42f4dab9649baea8a1b8503d1f"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.572617 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c5cvb" event={"ID":"100c0102-825d-4f1c-9a32-511f7ef4081c","Type":"ContainerStarted","Data":"b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.573625 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4263-account-create-update-6l4p9" event={"ID":"9836498d-4b30-4325-9cdc-0cfe5c3073fc","Type":"ContainerStarted","Data":"e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.574675 4954 generic.go:334] "Generic (PLEG): container finished" podID="bb30f1cd-3245-4e8e-a852-ef58699dff5f" containerID="12c5ed624d5e2f89166eade9f0fed510b6a70994b42c161aea7651baa6f59420" exitCode=0
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.574721 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f156-account-create-update-vxs99" event={"ID":"bb30f1cd-3245-4e8e-a852-ef58699dff5f","Type":"ContainerDied","Data":"12c5ed624d5e2f89166eade9f0fed510b6a70994b42c161aea7651baa6f59420"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.574738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f156-account-create-update-vxs99" event={"ID":"bb30f1cd-3245-4e8e-a852-ef58699dff5f","Type":"ContainerStarted","Data":"6265bb7ec523ff009c67c015a4117a388c4739aed2d1f479ede3db590d93367b"}
Dec 06 09:02:04 crc kubenswrapper[4954]: I1206 09:02:04.575498 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm" event={"ID":"17477a55-543f-4561-bbae-101a1fba05d9","Type":"ContainerStarted","Data":"4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641"}
Dec 06 09:02:05 crc kubenswrapper[4954]: I1206 09:02:05.583348 4954 generic.go:334] "Generic (PLEG): container finished" podID="17477a55-543f-4561-bbae-101a1fba05d9" containerID="82fc39882aa341485e8cd47169ef8d68727bd6f788de4fb8609efe635d7aa517" exitCode=0
Dec 06 09:02:05 crc kubenswrapper[4954]: I1206 09:02:05.583420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm" event={"ID":"17477a55-543f-4561-bbae-101a1fba05d9","Type":"ContainerDied","Data":"82fc39882aa341485e8cd47169ef8d68727bd6f788de4fb8609efe635d7aa517"}
Dec 06 09:02:05 crc kubenswrapper[4954]: I1206 09:02:05.584978 4954 generic.go:334] "Generic (PLEG): container finished" podID="9836498d-4b30-4325-9cdc-0cfe5c3073fc" containerID="add6d48afa13434d0f9e3087e5162265fd3fcb84c411bd80c1b6000c2ade33ad" exitCode=0
Dec 06 09:02:05 crc kubenswrapper[4954]: I1206 09:02:05.585022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4263-account-create-update-6l4p9" event={"ID":"9836498d-4b30-4325-9cdc-0cfe5c3073fc","Type":"ContainerDied","Data":"add6d48afa13434d0f9e3087e5162265fd3fcb84c411bd80c1b6000c2ade33ad"}
Dec 06 09:02:05 crc kubenswrapper[4954]: I1206 09:02:05.973452 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.135106 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.140891 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.150800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts\") pod \"0283d717-473b-41df-a35e-dc69e47c185f\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.150915 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.151029 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p4wh\" (UniqueName: \"kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh\") pod \"0283d717-473b-41df-a35e-dc69e47c185f\" (UID: \"0283d717-473b-41df-a35e-dc69e47c185f\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.153138 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0283d717-473b-41df-a35e-dc69e47c185f" (UID: "0283d717-473b-41df-a35e-dc69e47c185f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.169214 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh" (OuterVolumeSpecName: "kube-api-access-4p4wh") pod "0283d717-473b-41df-a35e-dc69e47c185f" (UID: "0283d717-473b-41df-a35e-dc69e47c185f"). InnerVolumeSpecName "kube-api-access-4p4wh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.252833 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts\") pod \"91b65e94-c4ec-45b8-b20f-9369a89280fd\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.252917 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts\") pod \"100c0102-825d-4f1c-9a32-511f7ef4081c\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.252944 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts\") pod \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253078 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhxs\" (UniqueName: \"kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs\") pod \"91b65e94-c4ec-45b8-b20f-9369a89280fd\" (UID: \"91b65e94-c4ec-45b8-b20f-9369a89280fd\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253145 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klhcx\" (UniqueName: \"kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx\") pod \"100c0102-825d-4f1c-9a32-511f7ef4081c\" (UID: \"100c0102-825d-4f1c-9a32-511f7ef4081c\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253182 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj2r7\" (UniqueName: \"kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7\") pod \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\" (UID: \"bb30f1cd-3245-4e8e-a852-ef58699dff5f\") "
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253428 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "100c0102-825d-4f1c-9a32-511f7ef4081c" (UID: "100c0102-825d-4f1c-9a32-511f7ef4081c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253507 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91b65e94-c4ec-45b8-b20f-9369a89280fd" (UID: "91b65e94-c4ec-45b8-b20f-9369a89280fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.253587 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb30f1cd-3245-4e8e-a852-ef58699dff5f" (UID: "bb30f1cd-3245-4e8e-a852-ef58699dff5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.254129 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b65e94-c4ec-45b8-b20f-9369a89280fd-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.254373 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100c0102-825d-4f1c-9a32-511f7ef4081c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.254387 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30f1cd-3245-4e8e-a852-ef58699dff5f-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.254400 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p4wh\" (UniqueName: \"kubernetes.io/projected/0283d717-473b-41df-a35e-dc69e47c185f-kube-api-access-4p4wh\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.254413 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0283d717-473b-41df-a35e-dc69e47c185f-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.256699 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs" (OuterVolumeSpecName: "kube-api-access-xkhxs") pod "91b65e94-c4ec-45b8-b20f-9369a89280fd" (UID: "91b65e94-c4ec-45b8-b20f-9369a89280fd"). InnerVolumeSpecName "kube-api-access-xkhxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.256944 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7" (OuterVolumeSpecName: "kube-api-access-hj2r7") pod "bb30f1cd-3245-4e8e-a852-ef58699dff5f" (UID: "bb30f1cd-3245-4e8e-a852-ef58699dff5f"). InnerVolumeSpecName "kube-api-access-hj2r7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.258682 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx" (OuterVolumeSpecName: "kube-api-access-klhcx") pod "100c0102-825d-4f1c-9a32-511f7ef4081c" (UID: "100c0102-825d-4f1c-9a32-511f7ef4081c"). InnerVolumeSpecName "kube-api-access-klhcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.356189 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhxs\" (UniqueName: \"kubernetes.io/projected/91b65e94-c4ec-45b8-b20f-9369a89280fd-kube-api-access-xkhxs\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.356231 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klhcx\" (UniqueName: \"kubernetes.io/projected/100c0102-825d-4f1c-9a32-511f7ef4081c-kube-api-access-klhcx\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.356244 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj2r7\" (UniqueName: \"kubernetes.io/projected/bb30f1cd-3245-4e8e-a852-ef58699dff5f-kube-api-access-hj2r7\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.598903 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c5cvb"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.599903 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c5cvb" event={"ID":"100c0102-825d-4f1c-9a32-511f7ef4081c","Type":"ContainerDied","Data":"b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7"}
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.599947 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b236b416a47f94925f3aa185237fa3c0a6024d37af4dfc541379daaf253c1db7"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.608144 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f156-account-create-update-vxs99"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.608238 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f156-account-create-update-vxs99" event={"ID":"bb30f1cd-3245-4e8e-a852-ef58699dff5f","Type":"ContainerDied","Data":"6265bb7ec523ff009c67c015a4117a388c4739aed2d1f479ede3db590d93367b"}
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.608314 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6265bb7ec523ff009c67c015a4117a388c4739aed2d1f479ede3db590d93367b"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.613490 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x24tp"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.613697 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x24tp" event={"ID":"0283d717-473b-41df-a35e-dc69e47c185f","Type":"ContainerDied","Data":"6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9"}
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.613735 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e04b05e87b4da159e3cdaeabb0c1f17b2138a6cea29e32f13f670245d7ef0a9"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.615417 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2srl5"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.623249 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2srl5" event={"ID":"91b65e94-c4ec-45b8-b20f-9369a89280fd","Type":"ContainerDied","Data":"4e181ec24be1d2cbb46ff1689a9c5b535928de016dd4d96e1648829fa5281285"}
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.623298 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e181ec24be1d2cbb46ff1689a9c5b535928de016dd4d96e1648829fa5281285"
Dec 06 09:02:06 crc kubenswrapper[4954]: I1206 09:02:06.994772 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.009981 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.172389 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts\") pod \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") "
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.172890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts\") pod \"17477a55-543f-4561-bbae-101a1fba05d9\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") "
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.172916 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9836498d-4b30-4325-9cdc-0cfe5c3073fc" (UID: "9836498d-4b30-4325-9cdc-0cfe5c3073fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.172971 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k4x5\" (UniqueName: \"kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5\") pod \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\" (UID: \"9836498d-4b30-4325-9cdc-0cfe5c3073fc\") "
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.173063 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5kl\" (UniqueName: \"kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl\") pod \"17477a55-543f-4561-bbae-101a1fba05d9\" (UID: \"17477a55-543f-4561-bbae-101a1fba05d9\") "
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.173527 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9836498d-4b30-4325-9cdc-0cfe5c3073fc-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.173684 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17477a55-543f-4561-bbae-101a1fba05d9" (UID: "17477a55-543f-4561-bbae-101a1fba05d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.177751 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5" (OuterVolumeSpecName: "kube-api-access-9k4x5") pod "9836498d-4b30-4325-9cdc-0cfe5c3073fc" (UID: "9836498d-4b30-4325-9cdc-0cfe5c3073fc"). InnerVolumeSpecName "kube-api-access-9k4x5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.178376 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl" (OuterVolumeSpecName: "kube-api-access-7z5kl") pod "17477a55-543f-4561-bbae-101a1fba05d9" (UID: "17477a55-543f-4561-bbae-101a1fba05d9"). InnerVolumeSpecName "kube-api-access-7z5kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.274891 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5kl\" (UniqueName: \"kubernetes.io/projected/17477a55-543f-4561-bbae-101a1fba05d9-kube-api-access-7z5kl\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.274930 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17477a55-543f-4561-bbae-101a1fba05d9-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.274940 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k4x5\" (UniqueName: \"kubernetes.io/projected/9836498d-4b30-4325-9cdc-0cfe5c3073fc-kube-api-access-9k4x5\") on node \"crc\" DevicePath \"\""
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.626185 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4263-account-create-update-6l4p9" event={"ID":"9836498d-4b30-4325-9cdc-0cfe5c3073fc","Type":"ContainerDied","Data":"e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637"}
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.626776 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e71ecb93dbeaac0816e00f1e775307afea3c22e31541ed688621e132ff078637"
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.626218 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4263-account-create-update-6l4p9"
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.627896 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm" event={"ID":"17477a55-543f-4561-bbae-101a1fba05d9","Type":"ContainerDied","Data":"4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641"}
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.627926 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f943a516102ccbef34cd21c62722e24d63402a195a17d1d98a421671fcc8641"
Dec 06 09:02:07 crc kubenswrapper[4954]: I1206 09:02:07.627981 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05ea-account-create-update-lrhlm"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.662692 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lqx47"]
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665700 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0283d717-473b-41df-a35e-dc69e47c185f" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665756 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0283d717-473b-41df-a35e-dc69e47c185f" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665772 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb30f1cd-3245-4e8e-a852-ef58699dff5f" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665780 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb30f1cd-3245-4e8e-a852-ef58699dff5f" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665812 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100c0102-825d-4f1c-9a32-511f7ef4081c" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665823 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="100c0102-825d-4f1c-9a32-511f7ef4081c" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665841 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9836498d-4b30-4325-9cdc-0cfe5c3073fc" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665849 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9836498d-4b30-4325-9cdc-0cfe5c3073fc" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665864 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b65e94-c4ec-45b8-b20f-9369a89280fd" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665872 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b65e94-c4ec-45b8-b20f-9369a89280fd" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: E1206 09:02:08.665906 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17477a55-543f-4561-bbae-101a1fba05d9" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.665914 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="17477a55-543f-4561-bbae-101a1fba05d9" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666124 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb30f1cd-3245-4e8e-a852-ef58699dff5f" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666147 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="100c0102-825d-4f1c-9a32-511f7ef4081c" containerName="mariadb-database-create"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666163 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="17477a55-543f-4561-bbae-101a1fba05d9" containerName="mariadb-account-create-update"
Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666178 4954 memory_manager.go:354] "RemoveStaleState removing state"
podUID="9836498d-4b30-4325-9cdc-0cfe5c3073fc" containerName="mariadb-account-create-update" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666195 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0283d717-473b-41df-a35e-dc69e47c185f" containerName="mariadb-database-create" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666208 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b65e94-c4ec-45b8-b20f-9369a89280fd" containerName="mariadb-database-create" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.666968 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.669376 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.670152 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gm4bt" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.677051 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.679413 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lqx47"] Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.827603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.827655 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.827881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.827955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6mv\" (UniqueName: \"kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.930219 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.930280 4954 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.930338 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.930364 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6mv\" (UniqueName: \"kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.938491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.939712 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.943025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.965204 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6mv\" (UniqueName: \"kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv\") pod \"nova-cell0-conductor-db-sync-lqx47\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:08 crc kubenswrapper[4954]: I1206 09:02:08.999130 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:09 crc kubenswrapper[4954]: I1206 09:02:09.469420 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lqx47"] Dec 06 09:02:09 crc kubenswrapper[4954]: W1206 09:02:09.474904 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2346b393_bbbb_4f4f_a056_e1697ea8428b.slice/crio-cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89 WatchSource:0}: Error finding container cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89: Status 404 returned error can't find the container with id cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89 Dec 06 09:02:09 crc kubenswrapper[4954]: I1206 09:02:09.649397 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lqx47" event={"ID":"2346b393-bbbb-4f4f-a056-e1697ea8428b","Type":"ContainerStarted","Data":"cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89"} Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.101608 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.101990 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.102048 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.102868 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.102969 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520" gracePeriod=600 Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.662600 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520" exitCode=0 Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.662738 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520"} Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.662919 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78"} Dec 06 09:02:10 crc kubenswrapper[4954]: I1206 09:02:10.662943 4954 scope.go:117] "RemoveContainer" containerID="4436336669c85fb06f91fbb34de049c56f89c0db1e88d72fdeb75193727824f8" Dec 06 09:02:18 crc kubenswrapper[4954]: I1206 09:02:18.745581 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lqx47" event={"ID":"2346b393-bbbb-4f4f-a056-e1697ea8428b","Type":"ContainerStarted","Data":"a7b3ebef671a3fe78d828715d13e643b24ca3d4717dd6da4b2b4e1eb74c2da22"} Dec 06 09:02:24 crc kubenswrapper[4954]: I1206 09:02:24.796497 4954 generic.go:334] "Generic (PLEG): container finished" podID="2346b393-bbbb-4f4f-a056-e1697ea8428b" containerID="a7b3ebef671a3fe78d828715d13e643b24ca3d4717dd6da4b2b4e1eb74c2da22" exitCode=0 Dec 06 09:02:24 crc kubenswrapper[4954]: I1206 09:02:24.796603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lqx47" event={"ID":"2346b393-bbbb-4f4f-a056-e1697ea8428b","Type":"ContainerDied","Data":"a7b3ebef671a3fe78d828715d13e643b24ca3d4717dd6da4b2b4e1eb74c2da22"} Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.072324 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lmzpf"] Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.083338 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7a78-account-create-update-79rdj"] Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.094939 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lmzpf"] Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.103547 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7a78-account-create-update-79rdj"] Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.464922 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c30c8ba-bb27-4cbf-9586-8c4c915bd571" path="/var/lib/kubelet/pods/3c30c8ba-bb27-4cbf-9586-8c4c915bd571/volumes" Dec 06 09:02:25 crc kubenswrapper[4954]: I1206 09:02:25.465657 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9542d633-4858-414f-8132-e4f3e860d56b" path="/var/lib/kubelet/pods/9542d633-4858-414f-8132-e4f3e860d56b/volumes" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.126938 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.190005 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle\") pod \"2346b393-bbbb-4f4f-a056-e1697ea8428b\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.190193 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data\") pod \"2346b393-bbbb-4f4f-a056-e1697ea8428b\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.190406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6mv\" (UniqueName: \"kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv\") pod \"2346b393-bbbb-4f4f-a056-e1697ea8428b\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.190436 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts\") pod \"2346b393-bbbb-4f4f-a056-e1697ea8428b\" (UID: \"2346b393-bbbb-4f4f-a056-e1697ea8428b\") " Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.196847 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts" (OuterVolumeSpecName: "scripts") pod "2346b393-bbbb-4f4f-a056-e1697ea8428b" (UID: "2346b393-bbbb-4f4f-a056-e1697ea8428b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.196926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv" (OuterVolumeSpecName: "kube-api-access-4c6mv") pod "2346b393-bbbb-4f4f-a056-e1697ea8428b" (UID: "2346b393-bbbb-4f4f-a056-e1697ea8428b"). InnerVolumeSpecName "kube-api-access-4c6mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.218834 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data" (OuterVolumeSpecName: "config-data") pod "2346b393-bbbb-4f4f-a056-e1697ea8428b" (UID: "2346b393-bbbb-4f4f-a056-e1697ea8428b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.221259 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2346b393-bbbb-4f4f-a056-e1697ea8428b" (UID: "2346b393-bbbb-4f4f-a056-e1697ea8428b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.292055 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.292093 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6mv\" (UniqueName: \"kubernetes.io/projected/2346b393-bbbb-4f4f-a056-e1697ea8428b-kube-api-access-4c6mv\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.292104 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.292116 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2346b393-bbbb-4f4f-a056-e1697ea8428b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.818427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lqx47" event={"ID":"2346b393-bbbb-4f4f-a056-e1697ea8428b","Type":"ContainerDied","Data":"cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89"} Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.818948 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf996cf5e8fbadce19f0b285d2a96e5efca2dba8c0c36b88f298d88599af3f89" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.818700 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lqx47" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.903095 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:02:26 crc kubenswrapper[4954]: E1206 09:02:26.903616 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2346b393-bbbb-4f4f-a056-e1697ea8428b" containerName="nova-cell0-conductor-db-sync" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.903635 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2346b393-bbbb-4f4f-a056-e1697ea8428b" containerName="nova-cell0-conductor-db-sync" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.903880 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2346b393-bbbb-4f4f-a056-e1697ea8428b" containerName="nova-cell0-conductor-db-sync" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.904658 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.907083 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.907296 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gm4bt" Dec 06 09:02:26 crc kubenswrapper[4954]: I1206 09:02:26.911226 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.105003 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrpgl\" (UniqueName: \"kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.105078 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.105132 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.206799 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrpgl\" (UniqueName: \"kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.206884 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.206946 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.217605 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.217790 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.224788 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrpgl\" (UniqueName: \"kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl\") pod \"nova-cell0-conductor-0\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.227452 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.705477 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:02:27 crc kubenswrapper[4954]: W1206 09:02:27.711048 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309d0b52_0b14_4f55_b15c_acb1a6824a88.slice/crio-c5026b8160d1f070ecaa5582f82e6a84679c1e1c64030fba5e2e7cedf256f2d0 WatchSource:0}: Error finding container c5026b8160d1f070ecaa5582f82e6a84679c1e1c64030fba5e2e7cedf256f2d0: Status 404 returned error can't find the container with id c5026b8160d1f070ecaa5582f82e6a84679c1e1c64030fba5e2e7cedf256f2d0 Dec 06 09:02:27 crc kubenswrapper[4954]: I1206 09:02:27.829470 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"309d0b52-0b14-4f55-b15c-acb1a6824a88","Type":"ContainerStarted","Data":"c5026b8160d1f070ecaa5582f82e6a84679c1e1c64030fba5e2e7cedf256f2d0"} Dec 06 09:02:28 crc kubenswrapper[4954]: I1206 09:02:28.846280 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"309d0b52-0b14-4f55-b15c-acb1a6824a88","Type":"ContainerStarted","Data":"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6"} Dec 06 09:02:28 crc kubenswrapper[4954]: I1206 09:02:28.846887 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.032089 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=11.032048804 podStartE2EDuration="11.032048804s" podCreationTimestamp="2025-12-06 09:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:28.864946311 +0000 UTC m=+7523.678305710" watchObservedRunningTime="2025-12-06 09:02:37.032048804 +0000 UTC m=+7531.845408193" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.036033 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-nv5pz"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.043089 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-nv5pz"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.252015 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.453618 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1477797-cbd1-46d5-a238-70b8a626f523" path="/var/lib/kubelet/pods/a1477797-cbd1-46d5-a238-70b8a626f523/volumes" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.705095 4954 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-cell-mapping-nrs7z"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.708364 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.712814 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.712880 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.740863 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nrs7z"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.816824 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.816884 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c26d\" (UniqueName: \"kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.816945 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.816965 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.924629 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.924689 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c26d\" (UniqueName: \"kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.924750 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " 
pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.924768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.925846 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.926941 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.939753 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.952475 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.952487 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.954690 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.972308 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:02:37 crc kubenswrapper[4954]: I1206 09:02:37.986712 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c26d\" (UniqueName: \"kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d\") pod \"nova-cell0-cell-mapping-nrs7z\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") " pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.028631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7gq\" (UniqueName: \"kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.028695 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.028725 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.044730 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.078188 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.118041 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.122971 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.133124 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7gq\" (UniqueName: \"kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.133202 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.133234 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.141690 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.144278 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.162482 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.185470 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7gq\" (UniqueName: \"kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq\") pod \"nova-scheduler-0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.234555 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdqj\" (UniqueName: \"kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj\") pod \"nova-metadata-0\" (UID: 
\"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.234645 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.234671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.234712 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.270078 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.271616 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.277869 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.291592 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.335781 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336794 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336835 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdqj\" (UniqueName: \"kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336879 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336900 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8lb\" (UniqueName: \"kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336920 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336938 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.336978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.344307 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.344853 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " 
pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.345386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.365854 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.367551 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.389584 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdqj\" (UniqueName: \"kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj\") pod \"nova-metadata-0\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.389754 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.412623 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.412975 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.416975 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449636 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449687 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449707 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449747 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449782 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8lb\" (UniqueName: \"kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449802 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.449858 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lps2f\" (UniqueName: \"kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.466435 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.470126 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.470681 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.472877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8lb\" (UniqueName: \"kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.499602 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551087 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551159 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551229 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551248 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551294 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551315 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqc8m\" (UniqueName: \"kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.551354 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lps2f\" (UniqueName: \"kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.552956 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.553344 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.553794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.556547 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.573070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lps2f\" (UniqueName: \"kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f\") pod \"dnsmasq-dns-f85bbdcf9-tk6ff\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") " pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.624217 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.653488 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.653954 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.654023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.654045 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc8m\" (UniqueName: \"kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.654617 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.658248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.658476 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.674932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqc8m\" (UniqueName: \"kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m\") pod \"nova-api-0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.823781 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.837808 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.867848 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nrs7z"] Dec 06 09:02:38 crc kubenswrapper[4954]: I1206 09:02:38.893411 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.009444 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nrs7z" event={"ID":"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6","Type":"ContainerStarted","Data":"a7403033532f1fd1527492ac6600ae179b417d0139ac0ad1151837d0a34925ff"} Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.010944 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0","Type":"ContainerStarted","Data":"0ae6f46d99afd34231db2ebfabe47549f644f8c58ddff105bf091f9b60d4f04d"} Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.012428 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xj9l"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.015704 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.021263 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.021502 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.040109 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xj9l"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.107581 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.161583 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.161637 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.161691 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krjl\" (UniqueName: \"kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.161728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.233379 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.269655 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.269738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.269812 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8krjl\" (UniqueName: \"kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.269887 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.287750 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.287897 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.289013 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.294657 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krjl\" (UniqueName: \"kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl\") pod \"nova-cell1-conductor-db-sync-6xj9l\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") " pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc 
kubenswrapper[4954]: I1206 09:02:39.382456 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.474482 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:39 crc kubenswrapper[4954]: W1206 09:02:39.501078 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbecf43b0_72f4_4d14_9d89_5593721d9ec0.slice/crio-cd7f954c14cd3880e2064b0b24ee3d660776c352732d1e88a38e4197791a9586 WatchSource:0}: Error finding container cd7f954c14cd3880e2064b0b24ee3d660776c352732d1e88a38e4197791a9586: Status 404 returned error can't find the container with id cd7f954c14cd3880e2064b0b24ee3d660776c352732d1e88a38e4197791a9586 Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.568697 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"] Dec 06 09:02:39 crc kubenswrapper[4954]: I1206 09:02:39.878807 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xj9l"] Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.048119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9","Type":"ContainerStarted","Data":"48bacfe652fcf79f409ccf172cdca87971cd8a43d05028a91804ab2ca252007d"} Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.052250 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nrs7z" event={"ID":"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6","Type":"ContainerStarted","Data":"ec5e22c416a70cb7e7195a71443f7142d49975c23ed8c4a0762e1db1c661a7dc"} Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.055165 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerStarted","Data":"cd7f954c14cd3880e2064b0b24ee3d660776c352732d1e88a38e4197791a9586"} Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.063189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerStarted","Data":"53c2901b707c049a902259dc848b5e9ba614078fdce9c08de3d020f62cb80b79"} Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.073849 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-nrs7z" podStartSLOduration=3.073828381 podStartE2EDuration="3.073828381s" podCreationTimestamp="2025-12-06 09:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:40.070278146 +0000 UTC m=+7534.883637545" watchObservedRunningTime="2025-12-06 09:02:40.073828381 +0000 UTC m=+7534.887187770" Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.074801 4954 generic.go:334] "Generic (PLEG): container finished" podID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerID="a06e5d0a634317d55b03665a2dd85d91619c9c649569c42261b501bfd0a3599f" exitCode=0 Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.074853 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" event={"ID":"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a","Type":"ContainerDied","Data":"a06e5d0a634317d55b03665a2dd85d91619c9c649569c42261b501bfd0a3599f"} 
Dec 06 09:02:40 crc kubenswrapper[4954]: I1206 09:02:40.074907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" event={"ID":"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a","Type":"ContainerStarted","Data":"ebc8dbe722682110dcec9feef55e8f0fbd8d9cd9f86d68885c25c641a316907c"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.125857 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" event={"ID":"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a","Type":"ContainerStarted","Data":"073a19617c17941e084c57a92ccacf88aede8bc77f16ff842329f11ea49766e3"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.126752 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.138886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0","Type":"ContainerStarted","Data":"485753a82f1e99673a1c494fb9f0c3605901b0edb6a6329c60c793102f85761f"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.144784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" event={"ID":"5c55851b-2db1-4dce-98e8-a7e3dcb41190","Type":"ContainerStarted","Data":"8bf150aa0019c3f84a3929dc1735d31e66ed2b18bac87cf3ef22153533367525"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.144823 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" event={"ID":"5c55851b-2db1-4dce-98e8-a7e3dcb41190","Type":"ContainerStarted","Data":"3436cdc52917deb8c32a40abe38c5caa57b37d503e70e62b89e4563fd194d22a"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.150181 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9","Type":"ContainerStarted","Data":"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.153905 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerStarted","Data":"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.157533 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" podStartSLOduration=4.15750514 podStartE2EDuration="4.15750514s" podCreationTimestamp="2025-12-06 09:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:42.151727776 +0000 UTC m=+7536.965087165" watchObservedRunningTime="2025-12-06 09:02:42.15750514 +0000 UTC m=+7536.970864529" Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.160574 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerStarted","Data":"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"} Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.177077 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.369474062 podStartE2EDuration="5.177056804s" podCreationTimestamp="2025-12-06 09:02:37 +0000 UTC" firstStartedPulling="2025-12-06 
09:02:38.92669576 +0000 UTC m=+7533.740055149" lastFinishedPulling="2025-12-06 09:02:41.734278512 +0000 UTC m=+7536.547637891" observedRunningTime="2025-12-06 09:02:42.163641115 +0000 UTC m=+7536.977000494" watchObservedRunningTime="2025-12-06 09:02:42.177056804 +0000 UTC m=+7536.990416203" Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.190868 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.552226708 podStartE2EDuration="4.190842844s" podCreationTimestamp="2025-12-06 09:02:38 +0000 UTC" firstStartedPulling="2025-12-06 09:02:39.125609529 +0000 UTC m=+7533.938968918" lastFinishedPulling="2025-12-06 09:02:41.764225665 +0000 UTC m=+7536.577585054" observedRunningTime="2025-12-06 09:02:42.178985596 +0000 UTC m=+7536.992344995" watchObservedRunningTime="2025-12-06 09:02:42.190842844 +0000 UTC m=+7537.004202233" Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.211438 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" podStartSLOduration=4.211402774 podStartE2EDuration="4.211402774s" podCreationTimestamp="2025-12-06 09:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:42.199168447 +0000 UTC m=+7537.012527846" watchObservedRunningTime="2025-12-06 09:02:42.211402774 +0000 UTC m=+7537.024762163" Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.630589 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:42 crc kubenswrapper[4954]: I1206 09:02:42.641324 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.174000 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerStarted","Data":"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"} Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.177384 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerStarted","Data":"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b"} Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.195159 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.668983295 podStartE2EDuration="5.195134648s" podCreationTimestamp="2025-12-06 09:02:38 +0000 UTC" firstStartedPulling="2025-12-06 09:02:39.253258398 +0000 UTC m=+7534.066617787" lastFinishedPulling="2025-12-06 09:02:41.779409751 +0000 UTC m=+7536.592769140" observedRunningTime="2025-12-06 09:02:43.192144268 +0000 UTC m=+7538.005503657" watchObservedRunningTime="2025-12-06 09:02:43.195134648 +0000 UTC m=+7538.008494057" Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.223592 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.953084386 podStartE2EDuration="5.223552289s" podCreationTimestamp="2025-12-06 09:02:38 +0000 UTC" firstStartedPulling="2025-12-06 09:02:39.537307688 +0000 UTC m=+7534.350667077" lastFinishedPulling="2025-12-06 09:02:41.807775591 +0000 UTC m=+7536.621134980" observedRunningTime="2025-12-06 09:02:43.213854099 +0000 UTC m=+7538.027213518" 
watchObservedRunningTime="2025-12-06 09:02:43.223552289 +0000 UTC m=+7538.036911688" Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.292090 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.500851 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.625650 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:02:43 crc kubenswrapper[4954]: I1206 09:02:43.625696 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.189200 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-log" containerID="cri-o://f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5" gracePeriod=30 Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.189437 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2" gracePeriod=30 Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.190898 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-metadata" containerID="cri-o://1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb" gracePeriod=30 Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.837163 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.893105 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle\") pod \"3183c048-9d1e-42e8-a51f-33e19b7da173\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.893181 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs\") pod \"3183c048-9d1e-42e8-a51f-33e19b7da173\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.893212 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfdqj\" (UniqueName: \"kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj\") pod \"3183c048-9d1e-42e8-a51f-33e19b7da173\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.893373 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data\") pod \"3183c048-9d1e-42e8-a51f-33e19b7da173\" (UID: \"3183c048-9d1e-42e8-a51f-33e19b7da173\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.893844 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs" (OuterVolumeSpecName: "logs") pod "3183c048-9d1e-42e8-a51f-33e19b7da173" (UID: "3183c048-9d1e-42e8-a51f-33e19b7da173"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.899611 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj" (OuterVolumeSpecName: "kube-api-access-bfdqj") pod "3183c048-9d1e-42e8-a51f-33e19b7da173" (UID: "3183c048-9d1e-42e8-a51f-33e19b7da173"). InnerVolumeSpecName "kube-api-access-bfdqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.924981 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data" (OuterVolumeSpecName: "config-data") pod "3183c048-9d1e-42e8-a51f-33e19b7da173" (UID: "3183c048-9d1e-42e8-a51f-33e19b7da173"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.928810 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3183c048-9d1e-42e8-a51f-33e19b7da173" (UID: "3183c048-9d1e-42e8-a51f-33e19b7da173"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.945925 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.994640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data\") pod \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.994831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle\") pod \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.995036 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8lb\" (UniqueName: \"kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb\") pod \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\" (UID: \"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9\") " Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.995532 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.995575 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3183c048-9d1e-42e8-a51f-33e19b7da173-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.995589 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfdqj\" (UniqueName: \"kubernetes.io/projected/3183c048-9d1e-42e8-a51f-33e19b7da173-kube-api-access-bfdqj\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.995599 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3183c048-9d1e-42e8-a51f-33e19b7da173-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:44 crc kubenswrapper[4954]: I1206 09:02:44.998114 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb" (OuterVolumeSpecName: "kube-api-access-bf8lb") pod "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" (UID: "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9"). InnerVolumeSpecName "kube-api-access-bf8lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.019793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data" (OuterVolumeSpecName: "config-data") pod "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" (UID: "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.027759 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" (UID: "1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.097385 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8lb\" (UniqueName: \"kubernetes.io/projected/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-kube-api-access-bf8lb\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.097440 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.097451 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.198424 4954 generic.go:334] "Generic (PLEG): container finished" podID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" containerID="bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2" exitCode=0 Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.198480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9","Type":"ContainerDied","Data":"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.198505 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9","Type":"ContainerDied","Data":"48bacfe652fcf79f409ccf172cdca87971cd8a43d05028a91804ab2ca252007d"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.198521 4954 scope.go:117] "RemoveContainer" containerID="bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.198644 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.201139 4954 generic.go:334] "Generic (PLEG): container finished" podID="ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" containerID="ec5e22c416a70cb7e7195a71443f7142d49975c23ed8c4a0762e1db1c661a7dc" exitCode=0 Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.201197 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nrs7z" event={"ID":"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6","Type":"ContainerDied","Data":"ec5e22c416a70cb7e7195a71443f7142d49975c23ed8c4a0762e1db1c661a7dc"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204043 4954 generic.go:334] "Generic (PLEG): container finished" podID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerID="1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb" exitCode=0 Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204071 4954 generic.go:334] "Generic (PLEG): container finished" podID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerID="f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5" exitCode=143 Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerDied","Data":"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204124 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerDied","Data":"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204136 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3183c048-9d1e-42e8-a51f-33e19b7da173","Type":"ContainerDied","Data":"53c2901b707c049a902259dc848b5e9ba614078fdce9c08de3d020f62cb80b79"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.204190 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.208444 4954 generic.go:334] "Generic (PLEG): container finished" podID="5c55851b-2db1-4dce-98e8-a7e3dcb41190" containerID="8bf150aa0019c3f84a3929dc1735d31e66ed2b18bac87cf3ef22153533367525" exitCode=0 Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.208539 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" event={"ID":"5c55851b-2db1-4dce-98e8-a7e3dcb41190","Type":"ContainerDied","Data":"8bf150aa0019c3f84a3929dc1735d31e66ed2b18bac87cf3ef22153533367525"} Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.221277 4954 scope.go:117] "RemoveContainer" containerID="bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2" Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.221698 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2\": container with ID starting with bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2 not found: ID does not exist" containerID="bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.221739 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2"} err="failed to get container status \"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2\": rpc error: code = NotFound desc = could not find container \"bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2\": container with ID starting with bcf24d950de2b4aae2973cc7aad9350fceeeee838a1fc6c9542c4272106899e2 not found: ID does not exist" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.221763 4954 scope.go:117] "RemoveContainer" containerID="1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.248081 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.271681 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.276808 4954 scope.go:117] "RemoveContainer" containerID="f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.327050 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.336905 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-metadata" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.336941 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-metadata" Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.336991 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.336999 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 
Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.337041 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-log"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.337049 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-log"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.352601 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" containerName="nova-cell1-novncproxy-novncproxy"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.352686 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-metadata"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.352703 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" containerName="nova-metadata-log"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.355766 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.355798 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.355895 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.361176 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.364378 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.364769 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.365623 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.371741 4954 scope.go:117] "RemoveContainer" containerID="1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"
Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.376894 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb\": container with ID starting with 1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb not found: ID does not exist" containerID="1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.376971 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"} err="failed to get container status \"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb\": rpc error: code = NotFound desc = could not find container \"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb\": container with ID starting with 1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb not found: ID does not exist"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.377017 4954 scope.go:117] "RemoveContainer" containerID="f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"
Dec 06 09:02:45 crc kubenswrapper[4954]: E1206 09:02:45.377585 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5\": container with ID starting with f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5 not found: ID does not exist" containerID="f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.377619 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"} err="failed to get container status \"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5\": rpc error: code = NotFound desc = could not find container \"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5\": container with ID starting with f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5 not found: ID does not exist"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.377645 4954 scope.go:117] "RemoveContainer" containerID="1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.377970 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb"} err="failed to get container status \"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb\": rpc error: code = NotFound desc = could not find container \"1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb\": container with ID starting with 1bd9905bb6bcb4b5e537265f60f7842e43e3a012ae4c44775a37b25e58f260cb not found: ID does not exist"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.377990 4954 scope.go:117] "RemoveContainer" containerID="f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.378507 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5"} err="failed to get container status \"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5\": rpc error: code = NotFound desc = could not find container \"f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5\": container with ID starting with f5de95013c1e37dbb9b3889890eaf420067d4fe5cf920c384a3abfe64b6898e5 not found: ID does not exist"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.378633 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.381185 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.384780 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.384939 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.393219 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.423459 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfl4\" (UniqueName: \"kubernetes.io/projected/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-kube-api-access-kwfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.423965 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.424010 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.424074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.424104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.452464 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9" path="/var/lib/kubelet/pods/1bc2d2d6-ee7a-439b-9ce0-2b9041a254b9/volumes"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.453038 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3183c048-9d1e-42e8-a51f-33e19b7da173" path="/var/lib/kubelet/pods/3183c048-9d1e-42e8-a51f-33e19b7da173/volumes"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.525373 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.525434 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.525460 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526261 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526323 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nhv\" (UniqueName: \"kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526364 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526467 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526591 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.526687 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfl4\" (UniqueName: \"kubernetes.io/projected/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-kube-api-access-kwfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.530803 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.531776 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.531877 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.538136 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.555468 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfl4\" (UniqueName: \"kubernetes.io/projected/7f9f9d71-3f70-4ef5-9d90-f0d792e5b646-kube-api-access-kwfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.628721 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.628788 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89nhv\" (UniqueName: \"kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.628842 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.628866 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.629117 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.631264 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.634595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.636051 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.638269 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.644377 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nhv\" (UniqueName: \"kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv\") pod \"nova-metadata-0\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " pod="openstack/nova-metadata-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.691159 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 09:02:45 crc kubenswrapper[4954]: I1206 09:02:45.701283 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.151178 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.254808 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646","Type":"ContainerStarted","Data":"6c5f5cdb13499f35589b024f85bceac7abc5624c9df22c34599b36376366bfec"}
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.278540 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.750450 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xj9l"
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.762778 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nrs7z"
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts\") pod \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852466 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8krjl\" (UniqueName: \"kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl\") pod \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852552 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts\") pod \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle\") pod \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852738 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data\") pod \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852789 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c26d\" (UniqueName: \"kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d\") pod \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852853 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle\") pod \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\" (UID: \"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.852881 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data\") pod \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\" (UID: \"5c55851b-2db1-4dce-98e8-a7e3dcb41190\") "
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.859268 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts" (OuterVolumeSpecName: "scripts") pod "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" (UID: "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.860381 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d" (OuterVolumeSpecName: "kube-api-access-7c26d") pod "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" (UID: "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6"). InnerVolumeSpecName "kube-api-access-7c26d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.862687 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl" (OuterVolumeSpecName: "kube-api-access-8krjl") pod "5c55851b-2db1-4dce-98e8-a7e3dcb41190" (UID: "5c55851b-2db1-4dce-98e8-a7e3dcb41190"). InnerVolumeSpecName "kube-api-access-8krjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.863897 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts" (OuterVolumeSpecName: "scripts") pod "5c55851b-2db1-4dce-98e8-a7e3dcb41190" (UID: "5c55851b-2db1-4dce-98e8-a7e3dcb41190"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.889202 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data" (OuterVolumeSpecName: "config-data") pod "5c55851b-2db1-4dce-98e8-a7e3dcb41190" (UID: "5c55851b-2db1-4dce-98e8-a7e3dcb41190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.894257 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data" (OuterVolumeSpecName: "config-data") pod "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" (UID: "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.894269 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c55851b-2db1-4dce-98e8-a7e3dcb41190" (UID: "5c55851b-2db1-4dce-98e8-a7e3dcb41190"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.898727 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" (UID: "ca4ca0d9-e86a-46f5-941a-b692d62ab2e6"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.953994 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8krjl\" (UniqueName: \"kubernetes.io/projected/5c55851b-2db1-4dce-98e8-a7e3dcb41190-kube-api-access-8krjl\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954037 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954047 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954058 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954068 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c26d\" (UniqueName: \"kubernetes.io/projected/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-kube-api-access-7c26d\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954076 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954084 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c55851b-2db1-4dce-98e8-a7e3dcb41190-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:46 crc kubenswrapper[4954]: I1206 09:02:46.954092 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.272435 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerStarted","Data":"a7aea1c152e4b6757f714d274e08c89358664c17ef13d3234675f1b21a924fb2"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.273789 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerStarted","Data":"e2361eba2d155e2b896747bb46e23176728e7d90688c7f3435fb18053039fb48"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.273915 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerStarted","Data":"5e3a81c049c08172fd21642424873b330471790f815839b8fda1ee1493654132"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.279336 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" event={"ID":"5c55851b-2db1-4dce-98e8-a7e3dcb41190","Type":"ContainerDied","Data":"3436cdc52917deb8c32a40abe38c5caa57b37d503e70e62b89e4563fd194d22a"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.279403 4954 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3436cdc52917deb8c32a40abe38c5caa57b37d503e70e62b89e4563fd194d22a" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.279482 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6xj9l" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.283893 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7f9f9d71-3f70-4ef5-9d90-f0d792e5b646","Type":"ContainerStarted","Data":"935ab1c24342c75d4d6a2f89e5b67aa2906889d3ff0b4d9070d141da998e1363"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.289833 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nrs7z" event={"ID":"ca4ca0d9-e86a-46f5-941a-b692d62ab2e6","Type":"ContainerDied","Data":"a7403033532f1fd1527492ac6600ae179b417d0139ac0ad1151837d0a34925ff"} Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.289890 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7403033532f1fd1527492ac6600ae179b417d0139ac0ad1151837d0a34925ff" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.289963 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nrs7z" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.304722 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.30470296 podStartE2EDuration="2.30470296s" podCreationTimestamp="2025-12-06 09:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:47.299120531 +0000 UTC m=+7542.112479930" watchObservedRunningTime="2025-12-06 09:02:47.30470296 +0000 UTC m=+7542.118062349" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.330088 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.330055409 podStartE2EDuration="2.330055409s" podCreationTimestamp="2025-12-06 09:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:47.322911518 +0000 UTC m=+7542.136270907" watchObservedRunningTime="2025-12-06 09:02:47.330055409 +0000 UTC m=+7542.143414798" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.350149 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:02:47 crc kubenswrapper[4954]: E1206 09:02:47.350858 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c55851b-2db1-4dce-98e8-a7e3dcb41190" containerName="nova-cell1-conductor-db-sync" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.350952 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c55851b-2db1-4dce-98e8-a7e3dcb41190" containerName="nova-cell1-conductor-db-sync" Dec 06 09:02:47 crc kubenswrapper[4954]: E1206 09:02:47.351072 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" containerName="nova-manage" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.351156 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" containerName="nova-manage" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.351465 4954 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" containerName="nova-manage" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.351544 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c55851b-2db1-4dce-98e8-a7e3dcb41190" containerName="nova-cell1-conductor-db-sync" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.352483 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.354911 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.358629 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.362731 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.362831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.362855 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgbjj\" (UniqueName: \"kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.431300 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.431547 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-log" containerID="cri-o://49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" gracePeriod=30 Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.431696 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-api" containerID="cri-o://a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" gracePeriod=30 Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.465696 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.465889 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" containerName="nova-scheduler-scheduler" containerID="cri-o://485753a82f1e99673a1c494fb9f0c3605901b0edb6a6329c60c793102f85761f" gracePeriod=30 Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.466093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.466175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.466197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgbjj\" (UniqueName: \"kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.480269 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.495323 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.500737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgbjj\" (UniqueName: \"kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.505595 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:47 crc kubenswrapper[4954]: I1206 09:02:47.704187 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.085731 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.190340 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data\") pod \"becf43b0-72f4-4d14-9d89-5593721d9ec0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.190455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle\") pod \"becf43b0-72f4-4d14-9d89-5593721d9ec0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.190671 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqc8m\" (UniqueName: \"kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m\") pod \"becf43b0-72f4-4d14-9d89-5593721d9ec0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.191591 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs\") pod \"becf43b0-72f4-4d14-9d89-5593721d9ec0\" (UID: \"becf43b0-72f4-4d14-9d89-5593721d9ec0\") " Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.191804 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs" (OuterVolumeSpecName: "logs") pod "becf43b0-72f4-4d14-9d89-5593721d9ec0" (UID: "becf43b0-72f4-4d14-9d89-5593721d9ec0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.192355 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/becf43b0-72f4-4d14-9d89-5593721d9ec0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.203008 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m" (OuterVolumeSpecName: "kube-api-access-qqc8m") pod "becf43b0-72f4-4d14-9d89-5593721d9ec0" (UID: "becf43b0-72f4-4d14-9d89-5593721d9ec0"). InnerVolumeSpecName "kube-api-access-qqc8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.215107 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "becf43b0-72f4-4d14-9d89-5593721d9ec0" (UID: "becf43b0-72f4-4d14-9d89-5593721d9ec0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.225171 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data" (OuterVolumeSpecName: "config-data") pod "becf43b0-72f4-4d14-9d89-5593721d9ec0" (UID: "becf43b0-72f4-4d14-9d89-5593721d9ec0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.293766 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.293805 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/becf43b0-72f4-4d14-9d89-5593721d9ec0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.293822 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqc8m\" (UniqueName: \"kubernetes.io/projected/becf43b0-72f4-4d14-9d89-5593721d9ec0-kube-api-access-qqc8m\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.301293 4954 generic.go:334] "Generic (PLEG): container finished" podID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerID="a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" exitCode=0 Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.301349 4954 generic.go:334] "Generic (PLEG): container finished" podID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerID="49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" exitCode=143 Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.302750 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.306754 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerDied","Data":"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b"} Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.306861 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerDied","Data":"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311"} Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.306883 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"becf43b0-72f4-4d14-9d89-5593721d9ec0","Type":"ContainerDied","Data":"cd7f954c14cd3880e2064b0b24ee3d660776c352732d1e88a38e4197791a9586"} Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.306912 4954 scope.go:117] "RemoveContainer" containerID="a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.312747 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.455140 4954 scope.go:117] "RemoveContainer" containerID="49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.487125 4954 scope.go:117] "RemoveContainer" containerID="a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" Dec 06 09:02:48 crc kubenswrapper[4954]: E1206 09:02:48.487643 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b\": container with ID starting with a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b not found: ID does not exist" 
containerID="a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.487680 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b"} err="failed to get container status \"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b\": rpc error: code = NotFound desc = could not find container \"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b\": container with ID starting with a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b not found: ID does not exist" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.487706 4954 scope.go:117] "RemoveContainer" containerID="49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" Dec 06 09:02:48 crc kubenswrapper[4954]: E1206 09:02:48.488702 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311\": container with ID starting with 49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311 not found: ID does not exist" containerID="49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.488740 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311"} err="failed to get container status \"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311\": rpc error: code = NotFound desc = could not find container \"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311\": container with ID starting with 49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311 not found: ID does not exist" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.488760 4954 scope.go:117] "RemoveContainer" containerID="a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.489104 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b"} err="failed to get container status \"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b\": rpc error: code = NotFound desc = could not find container \"a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b\": container with ID starting with a2bc6c989244b9b3382af0ea6fa4d4f5812eaf8d2451316302a9892518c7760b not found: ID does not exist" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.489133 4954 scope.go:117] "RemoveContainer" containerID="49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.494024 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.494032 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311"} err="failed to get container status \"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311\": rpc error: code = NotFound desc = could not find container \"49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311\": container with ID starting with 
49835852d73e193582ba30ea8e61626cc5bd001243125e937b03423d07641311 not found: ID does not exist" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.520191 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.527885 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:48 crc kubenswrapper[4954]: E1206 09:02:48.528440 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-api" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.528469 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-api" Dec 06 09:02:48 crc kubenswrapper[4954]: E1206 09:02:48.528496 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-log" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.528505 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-log" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.528786 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-api" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.528812 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" containerName="nova-api-log" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.530161 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.534342 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.535479 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.701250 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.701326 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.701379 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rff8r\" (UniqueName: \"kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.701413 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc 
kubenswrapper[4954]: I1206 09:02:48.803471 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.803644 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.803705 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.803769 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rff8r\" (UniqueName: \"kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.804238 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.808740 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.808737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.825343 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rff8r\" (UniqueName: \"kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r\") pod \"nova-api-0\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.839739 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.862196 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.953226 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:02:48 crc kubenswrapper[4954]: I1206 09:02:48.953522 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="dnsmasq-dns" containerID="cri-o://c850f31740b9bc831a3e309e59b3fb0fb870f632cac3b269ff39b146154c87f7" gracePeriod=10 Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.146346 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.83:5353: connect: connection refused" Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.364712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d57c38b-caac-4a50-b930-85079409a490","Type":"ContainerStarted","Data":"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c"} Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.364764 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d57c38b-caac-4a50-b930-85079409a490","Type":"ContainerStarted","Data":"b30209fdd1ebadeb04a4a09ec4e4f2f8688722046245838a079298b8b88bec13"} Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.364843 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-log" containerID="cri-o://e2361eba2d155e2b896747bb46e23176728e7d90688c7f3435fb18053039fb48" gracePeriod=30 Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.364896 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-metadata" containerID="cri-o://a7aea1c152e4b6757f714d274e08c89358664c17ef13d3234675f1b21a924fb2" gracePeriod=30 Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.385683 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.385661537 podStartE2EDuration="2.385661537s" podCreationTimestamp="2025-12-06 09:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:49.383475888 +0000 UTC m=+7544.196835297" watchObservedRunningTime="2025-12-06 09:02:49.385661537 +0000 UTC m=+7544.199020926" Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.407394 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:02:49 crc kubenswrapper[4954]: I1206 09:02:49.457399 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="becf43b0-72f4-4d14-9d89-5593721d9ec0" path="/var/lib/kubelet/pods/becf43b0-72f4-4d14-9d89-5593721d9ec0/volumes" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402016 4954 generic.go:334] "Generic (PLEG): container finished" podID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerID="a7aea1c152e4b6757f714d274e08c89358664c17ef13d3234675f1b21a924fb2" exitCode=0 Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402500 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerID="e2361eba2d155e2b896747bb46e23176728e7d90688c7f3435fb18053039fb48" exitCode=143 Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402436 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerDied","Data":"a7aea1c152e4b6757f714d274e08c89358664c17ef13d3234675f1b21a924fb2"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402599 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerDied","Data":"e2361eba2d155e2b896747bb46e23176728e7d90688c7f3435fb18053039fb48"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402613 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"114844b1-fbfb-46b8-80d1-27d3fad69f5a","Type":"ContainerDied","Data":"5e3a81c049c08172fd21642424873b330471790f815839b8fda1ee1493654132"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.402625 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e3a81c049c08172fd21642424873b330471790f815839b8fda1ee1493654132" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.417436 4954 generic.go:334] "Generic (PLEG): container finished" podID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerID="c850f31740b9bc831a3e309e59b3fb0fb870f632cac3b269ff39b146154c87f7" exitCode=0 Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.417587 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" event={"ID":"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2","Type":"ContainerDied","Data":"c850f31740b9bc831a3e309e59b3fb0fb870f632cac3b269ff39b146154c87f7"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.429272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerStarted","Data":"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.429387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerStarted","Data":"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.429842 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerStarted","Data":"7ea4015718f214d9d1e809e454b83a8c56e6637f3ed16f1d68e3e180f5051f7d"} Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.431160 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.448812 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.485038 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.485014446 podStartE2EDuration="2.485014446s" podCreationTimestamp="2025-12-06 09:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:50.455342582 +0000 UTC m=+7545.268701991" watchObservedRunningTime="2025-12-06 09:02:50.485014446 +0000 UTC m=+7545.298373835" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.543006 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle\") pod \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.543094 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs\") pod \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.543130 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data\") pod \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.543264 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89nhv\" (UniqueName: \"kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv\") pod \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.543331 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs\") pod \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\" (UID: \"114844b1-fbfb-46b8-80d1-27d3fad69f5a\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.545244 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs" (OuterVolumeSpecName: "logs") pod "114844b1-fbfb-46b8-80d1-27d3fad69f5a" (UID: "114844b1-fbfb-46b8-80d1-27d3fad69f5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.551742 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv" (OuterVolumeSpecName: "kube-api-access-89nhv") pod "114844b1-fbfb-46b8-80d1-27d3fad69f5a" (UID: "114844b1-fbfb-46b8-80d1-27d3fad69f5a"). InnerVolumeSpecName "kube-api-access-89nhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.573261 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "114844b1-fbfb-46b8-80d1-27d3fad69f5a" (UID: "114844b1-fbfb-46b8-80d1-27d3fad69f5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.578858 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data" (OuterVolumeSpecName: "config-data") pod "114844b1-fbfb-46b8-80d1-27d3fad69f5a" (UID: "114844b1-fbfb-46b8-80d1-27d3fad69f5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.597494 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "114844b1-fbfb-46b8-80d1-27d3fad69f5a" (UID: "114844b1-fbfb-46b8-80d1-27d3fad69f5a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.645509 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89nhv\" (UniqueName: \"kubernetes.io/projected/114844b1-fbfb-46b8-80d1-27d3fad69f5a-kube-api-access-89nhv\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.645546 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/114844b1-fbfb-46b8-80d1-27d3fad69f5a-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.645574 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.645583 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.645593 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/114844b1-fbfb-46b8-80d1-27d3fad69f5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.673710 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.696556 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.746549 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config\") pod \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.746609 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc\") pod \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.746682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb\") pod \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.746915 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb\") pod \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.746940 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfqg2\" (UniqueName: \"kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2\") pod \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\" (UID: \"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2\") " Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.752819 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2" (OuterVolumeSpecName: "kube-api-access-dfqg2") pod "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" (UID: "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2"). InnerVolumeSpecName "kube-api-access-dfqg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.787365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" (UID: "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.787615 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config" (OuterVolumeSpecName: "config") pod "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" (UID: "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.792431 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" (UID: "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.804296 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" (UID: "11ea7382-8d96-4609-a38c-b7fd6ef3a0a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.848453 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.848485 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfqg2\" (UniqueName: \"kubernetes.io/projected/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-kube-api-access-dfqg2\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.848501 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.848510 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:50 crc kubenswrapper[4954]: I1206 09:02:50.848521 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.041304 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xjqjk"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.048843 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xjqjk"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.439807 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.439842 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.439893 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9f8d8cdf-5rtxk" event={"ID":"11ea7382-8d96-4609-a38c-b7fd6ef3a0a2","Type":"ContainerDied","Data":"1f7d3518d008fffc7272ccb2ccac25d3db49cc8efac9da6b58966cf4191ea356"} Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.439945 4954 scope.go:117] "RemoveContainer" containerID="c850f31740b9bc831a3e309e59b3fb0fb870f632cac3b269ff39b146154c87f7" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.458192 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668295d0-3bb0-41cd-ae1d-6ab1558d79fc" path="/var/lib/kubelet/pods/668295d0-3bb0-41cd-ae1d-6ab1558d79fc/volumes" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.468168 4954 scope.go:117] "RemoveContainer" containerID="c35638e43c2562e6dfb9cf83a637ba84197fc11b549b9b4800be864aa7e6ce79" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.493546 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.516595 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9f8d8cdf-5rtxk"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.529184 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.552864 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.565919 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:51 crc kubenswrapper[4954]: E1206 09:02:51.566306 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-log" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566327 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-log" Dec 06 09:02:51 crc kubenswrapper[4954]: E1206 09:02:51.566352 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="dnsmasq-dns" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566358 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="dnsmasq-dns" Dec 06 09:02:51 crc kubenswrapper[4954]: E1206 09:02:51.566368 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="init" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566374 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="init" Dec 06 09:02:51 crc kubenswrapper[4954]: E1206 09:02:51.566396 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-metadata" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566403 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-metadata" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566615 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-log" Dec 06 09:02:51 crc 
kubenswrapper[4954]: I1206 09:02:51.566644 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" containerName="nova-metadata-metadata" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.566658 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" containerName="dnsmasq-dns" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.567588 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.569426 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.569744 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.577020 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.661891 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.661972 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.662138 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.662254 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgz7\" (UniqueName: \"kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.662319 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.763447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.763606 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.763664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.763725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgz7\" (UniqueName: \"kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.763759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.764281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.768432 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.768600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.774395 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.786137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgz7\" (UniqueName: \"kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7\") pod \"nova-metadata-0\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " pod="openstack/nova-metadata-0" Dec 06 09:02:51 crc kubenswrapper[4954]: I1206 09:02:51.887489 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:02:52 crc kubenswrapper[4954]: I1206 09:02:52.365335 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:02:52 crc kubenswrapper[4954]: I1206 09:02:52.457828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerStarted","Data":"e41b21b89caf434aec43ec6ecde8415e8d88662e3020ee5786e26788f963db0a"} Dec 06 09:02:53 crc kubenswrapper[4954]: I1206 09:02:53.453502 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="114844b1-fbfb-46b8-80d1-27d3fad69f5a" path="/var/lib/kubelet/pods/114844b1-fbfb-46b8-80d1-27d3fad69f5a/volumes" Dec 06 09:02:53 crc kubenswrapper[4954]: I1206 09:02:53.454421 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ea7382-8d96-4609-a38c-b7fd6ef3a0a2" path="/var/lib/kubelet/pods/11ea7382-8d96-4609-a38c-b7fd6ef3a0a2/volumes" Dec 06 09:02:53 crc kubenswrapper[4954]: I1206 09:02:53.469799 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerStarted","Data":"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c"} Dec 06 09:02:53 crc kubenswrapper[4954]: I1206 09:02:53.471743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerStarted","Data":"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8"} Dec 06 09:02:53 crc kubenswrapper[4954]: I1206 09:02:53.501695 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5016748890000002 podStartE2EDuration="2.501674889s" podCreationTimestamp="2025-12-06 09:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:53.493391437 +0000 UTC m=+7548.306750836" watchObservedRunningTime="2025-12-06 09:02:53.501674889 +0000 UTC m=+7548.315034278" Dec 06 09:02:55 crc kubenswrapper[4954]: I1206 09:02:55.692716 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:55 crc kubenswrapper[4954]: I1206 09:02:55.720632 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:56 crc kubenswrapper[4954]: I1206 09:02:56.517996 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 06 09:02:56 crc kubenswrapper[4954]: I1206 09:02:56.888193 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:02:56 crc kubenswrapper[4954]: I1206 09:02:56.888592 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:02:57 crc kubenswrapper[4954]: I1206 09:02:57.734177 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.245536 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nvnws"] Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.246740 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.249709 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.249966 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.260627 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nvnws"] Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.413931 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.414289 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.414519 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.414708 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qb4d\" (UniqueName: \"kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.517156 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.517246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qb4d\" (UniqueName: \"kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.517311 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.517335 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.522777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.523485 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.525678 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.542016 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qb4d\" (UniqueName: \"kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d\") pod \"nova-cell1-cell-mapping-nvnws\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.633721 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.863938 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:02:58 crc kubenswrapper[4954]: I1206 09:02:58.864194 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.106070 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nvnws"] Dec 06 09:02:59 crc kubenswrapper[4954]: W1206 09:02:59.114242 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44770c67_121a_4d30_8915_bd5083f9084b.slice/crio-28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d WatchSource:0}: Error finding container 28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d: Status 404 returned error can't find the container with id 28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.553187 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nvnws" event={"ID":"44770c67-121a-4d30-8915-bd5083f9084b","Type":"ContainerStarted","Data":"9635d8deddc78aea91f3844e877dd66ab05b48a040214ccd70d712cddc555dfe"} Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.556712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nvnws" event={"ID":"44770c67-121a-4d30-8915-bd5083f9084b","Type":"ContainerStarted","Data":"28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d"} Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.589163 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nvnws" podStartSLOduration=1.589122186 podStartE2EDuration="1.589122186s" podCreationTimestamp="2025-12-06 09:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:02:59.578744268 +0000 UTC m=+7554.392103657" watchObservedRunningTime="2025-12-06 09:02:59.589122186 +0000 UTC m=+7554.402481575" Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.946900 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:02:59 crc kubenswrapper[4954]: I1206 09:02:59.946898 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:01 crc kubenswrapper[4954]: I1206 09:03:01.888808 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:03:01 crc kubenswrapper[4954]: I1206 09:03:01.889101 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:03:02 crc kubenswrapper[4954]: I1206 09:03:02.901026 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.105:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:02 crc kubenswrapper[4954]: I1206 09:03:02.901016 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.105:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:04 crc kubenswrapper[4954]: I1206 09:03:04.596723 4954 generic.go:334] "Generic (PLEG): container finished" podID="44770c67-121a-4d30-8915-bd5083f9084b" containerID="9635d8deddc78aea91f3844e877dd66ab05b48a040214ccd70d712cddc555dfe" exitCode=0 Dec 06 09:03:04 crc kubenswrapper[4954]: I1206 09:03:04.597189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nvnws" event={"ID":"44770c67-121a-4d30-8915-bd5083f9084b","Type":"ContainerDied","Data":"9635d8deddc78aea91f3844e877dd66ab05b48a040214ccd70d712cddc555dfe"} Dec 06 09:03:05 crc kubenswrapper[4954]: I1206 09:03:05.982674 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.166016 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts\") pod \"44770c67-121a-4d30-8915-bd5083f9084b\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.166251 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data\") pod \"44770c67-121a-4d30-8915-bd5083f9084b\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.166345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qb4d\" (UniqueName: \"kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d\") pod \"44770c67-121a-4d30-8915-bd5083f9084b\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.166510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle\") pod \"44770c67-121a-4d30-8915-bd5083f9084b\" (UID: \"44770c67-121a-4d30-8915-bd5083f9084b\") " Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.172661 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d" (OuterVolumeSpecName: "kube-api-access-7qb4d") pod "44770c67-121a-4d30-8915-bd5083f9084b" (UID: "44770c67-121a-4d30-8915-bd5083f9084b"). InnerVolumeSpecName "kube-api-access-7qb4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.173686 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts" (OuterVolumeSpecName: "scripts") pod "44770c67-121a-4d30-8915-bd5083f9084b" (UID: "44770c67-121a-4d30-8915-bd5083f9084b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.193416 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data" (OuterVolumeSpecName: "config-data") pod "44770c67-121a-4d30-8915-bd5083f9084b" (UID: "44770c67-121a-4d30-8915-bd5083f9084b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.200461 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44770c67-121a-4d30-8915-bd5083f9084b" (UID: "44770c67-121a-4d30-8915-bd5083f9084b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.269143 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qb4d\" (UniqueName: \"kubernetes.io/projected/44770c67-121a-4d30-8915-bd5083f9084b-kube-api-access-7qb4d\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.269182 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.269194 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.269224 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44770c67-121a-4d30-8915-bd5083f9084b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.615814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nvnws" event={"ID":"44770c67-121a-4d30-8915-bd5083f9084b","Type":"ContainerDied","Data":"28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d"} Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.615862 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f218e0fba0b010b050d8f9c2494ca95f5c763df7aded6e7e6dea39ca73a51d" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.615870 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nvnws" Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.794653 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.795190 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-log" containerID="cri-o://c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc" gracePeriod=30 Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.795293 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-api" containerID="cri-o://2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76" gracePeriod=30 Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.819511 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.819839 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-log" containerID="cri-o://1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8" gracePeriod=30 Dec 06 09:03:06 crc kubenswrapper[4954]: I1206 09:03:06.820328 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-metadata" containerID="cri-o://f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c" gracePeriod=30 Dec 06 09:03:07 crc kubenswrapper[4954]: I1206 09:03:07.639108 4954 generic.go:334] "Generic (PLEG): container finished" podID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerID="c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc" exitCode=143 Dec 06 09:03:07 crc kubenswrapper[4954]: I1206 09:03:07.639206 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerDied","Data":"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc"} Dec 06 09:03:07 crc kubenswrapper[4954]: I1206 09:03:07.643223 4954 generic.go:334] "Generic (PLEG): container finished" podID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerID="1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8" exitCode=143 Dec 06 09:03:07 crc kubenswrapper[4954]: I1206 09:03:07.643310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerDied","Data":"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8"} Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.484460 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.652964 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs\") pod \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.653092 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs\") pod \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.653124 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data\") pod \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.653254 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle\") pod \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.653288 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgz7\" (UniqueName: \"kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7\") pod \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\" (UID: \"dc83a5f0-0a4a-4617-b02b-41bcf291fc22\") " Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.653549 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs" (OuterVolumeSpecName: "logs") pod "dc83a5f0-0a4a-4617-b02b-41bcf291fc22" (UID: "dc83a5f0-0a4a-4617-b02b-41bcf291fc22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.654254 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.664828 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7" (OuterVolumeSpecName: "kube-api-access-2sgz7") pod "dc83a5f0-0a4a-4617-b02b-41bcf291fc22" (UID: "dc83a5f0-0a4a-4617-b02b-41bcf291fc22"). InnerVolumeSpecName "kube-api-access-2sgz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.686642 4954 generic.go:334] "Generic (PLEG): container finished" podID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerID="f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c" exitCode=0 Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.686685 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerDied","Data":"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c"} Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.686712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc83a5f0-0a4a-4617-b02b-41bcf291fc22","Type":"ContainerDied","Data":"e41b21b89caf434aec43ec6ecde8415e8d88662e3020ee5786e26788f963db0a"} Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.686728 4954 scope.go:117] "RemoveContainer" containerID="f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.686845 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.727866 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data" (OuterVolumeSpecName: "config-data") pod "dc83a5f0-0a4a-4617-b02b-41bcf291fc22" (UID: "dc83a5f0-0a4a-4617-b02b-41bcf291fc22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.733321 4954 scope.go:117] "RemoveContainer" containerID="1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.743857 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc83a5f0-0a4a-4617-b02b-41bcf291fc22" (UID: "dc83a5f0-0a4a-4617-b02b-41bcf291fc22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.756792 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.756832 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sgz7\" (UniqueName: \"kubernetes.io/projected/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-kube-api-access-2sgz7\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.756846 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.774767 4954 scope.go:117] "RemoveContainer" containerID="f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c" Dec 06 09:03:10 crc kubenswrapper[4954]: E1206 09:03:10.775513 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c\": container with ID starting with f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c not found: ID does not exist" containerID="f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.775555 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c"} err="failed to get container status \"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c\": rpc error: code = NotFound desc = could not find container \"f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c\": container with ID starting with f99ee20c804edbdf031c2d1bd792eb259bf25452e42624851d7a3ac4f71a5e3c not found: ID does not exist" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.775597 4954 scope.go:117] "RemoveContainer" containerID="1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.781026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dc83a5f0-0a4a-4617-b02b-41bcf291fc22" (UID: "dc83a5f0-0a4a-4617-b02b-41bcf291fc22"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:10 crc kubenswrapper[4954]: E1206 09:03:10.781102 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8\": container with ID starting with 1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8 not found: ID does not exist" containerID="1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.781147 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8"} err="failed to get container status \"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8\": rpc error: code = NotFound desc = could not find container \"1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8\": container with ID starting with 1b1766c142e01846a7945324a0c19bed6622d392361104da88fa4fc153ca16a8 not found: ID does not exist" Dec 06 09:03:10 crc kubenswrapper[4954]: I1206 09:03:10.858212 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc83a5f0-0a4a-4617-b02b-41bcf291fc22-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.092753 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.103728 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.120062 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:11 crc kubenswrapper[4954]: E1206 09:03:11.120895 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-metadata" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121015 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-metadata" Dec 06 09:03:11 crc kubenswrapper[4954]: E1206 09:03:11.121103 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44770c67-121a-4d30-8915-bd5083f9084b" containerName="nova-manage" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121173 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="44770c67-121a-4d30-8915-bd5083f9084b" containerName="nova-manage" Dec 06 09:03:11 crc kubenswrapper[4954]: E1206 09:03:11.121254 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-log" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121310 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-log" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121541 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-log" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121639 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" containerName="nova-metadata-metadata" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.121706 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="44770c67-121a-4d30-8915-bd5083f9084b" containerName="nova-manage" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.122711 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.131415 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.159775 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.160282 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.264346 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.264417 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.264667 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94whj\" (UniqueName: \"kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.265040 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.265103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.366811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.366938 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94whj\" (UniqueName: \"kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.367046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.367077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.367145 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.367894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.371199 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.373542 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.375696 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.391233 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94whj\" (UniqueName: \"kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj\") pod \"nova-metadata-0\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.439369 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:03:11 crc kubenswrapper[4954]: I1206 09:03:11.453031 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc83a5f0-0a4a-4617-b02b-41bcf291fc22" path="/var/lib/kubelet/pods/dc83a5f0-0a4a-4617-b02b-41bcf291fc22/volumes" Dec 06 09:03:12 crc kubenswrapper[4954]: I1206 09:03:12.101100 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:03:12 crc kubenswrapper[4954]: I1206 09:03:12.741167 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerStarted","Data":"5ba2b28d1db1e5489ac7175d5b6f06f37a289288cd713269dfbe260180b3d9f2"} Dec 06 09:03:12 crc kubenswrapper[4954]: I1206 09:03:12.741767 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerStarted","Data":"a63ecc39c19cdb980123413a222cae58a66829ca8107faaa473e80b57ac52273"} Dec 06 09:03:12 crc kubenswrapper[4954]: I1206 09:03:12.741788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerStarted","Data":"1ee4bb710f71f144649f70d9d7b190e364ac0288a0815916f136cc9c24b7736b"} Dec 06 09:03:12 crc kubenswrapper[4954]: I1206 09:03:12.767845 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.767816921 podStartE2EDuration="1.767816921s" podCreationTimestamp="2025-12-06 09:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:03:12.762632202 +0000 UTC m=+7567.575991601" watchObservedRunningTime="2025-12-06 09:03:12.767816921 +0000 UTC m=+7567.581176320" Dec 06 09:03:16 crc kubenswrapper[4954]: I1206 09:03:16.440031 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:03:16 crc kubenswrapper[4954]: I1206 09:03:16.440458 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:03:17 crc kubenswrapper[4954]: I1206 09:03:17.797795 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" containerID="485753a82f1e99673a1c494fb9f0c3605901b0edb6a6329c60c793102f85761f" exitCode=137 Dec 06 09:03:17 crc kubenswrapper[4954]: I1206 09:03:17.797886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0","Type":"ContainerDied","Data":"485753a82f1e99673a1c494fb9f0c3605901b0edb6a6329c60c793102f85761f"} Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.018893 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.102213 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc7gq\" (UniqueName: \"kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq\") pod \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.102361 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle\") pod \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.102399 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data\") pod \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\" (UID: \"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0\") " Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.108841 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq" (OuterVolumeSpecName: "kube-api-access-wc7gq") pod "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" (UID: "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0"). InnerVolumeSpecName "kube-api-access-wc7gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.129439 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data" (OuterVolumeSpecName: "config-data") pod "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" (UID: "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.134722 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" (UID: "cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.205518 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc7gq\" (UniqueName: \"kubernetes.io/projected/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-kube-api-access-wc7gq\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.205582 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.205597 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.809344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0","Type":"ContainerDied","Data":"0ae6f46d99afd34231db2ebfabe47549f644f8c58ddff105bf091f9b60d4f04d"} Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.809400 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.809834 4954 scope.go:117] "RemoveContainer" containerID="485753a82f1e99673a1c494fb9f0c3605901b0edb6a6329c60c793102f85761f" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.843118 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.854225 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.863631 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.863680 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.865572 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:03:18 crc kubenswrapper[4954]: E1206 09:03:18.865941 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" containerName="nova-scheduler-scheduler" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.865960 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" containerName="nova-scheduler-scheduler" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.866152 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" containerName="nova-scheduler-scheduler" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.866770 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.868830 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 09:03:18 crc kubenswrapper[4954]: I1206 09:03:18.895413 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.020706 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzml\" (UniqueName: \"kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.020761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.020917 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.122694 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.122893 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzml\" (UniqueName: \"kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.122928 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.127289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.127289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.146374 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzml\" (UniqueName: 
\"kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml\") pod \"nova-scheduler-0\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.190265 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.455665 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0" path="/var/lib/kubelet/pods/cfb4fa80-6c0b-4c19-a73e-ea07f39bd4b0/volumes" Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.617713 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:03:19 crc kubenswrapper[4954]: W1206 09:03:19.627837 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6699d8f6_3aa0_4564_ac22_cfbfade1f563.slice/crio-88a9dfcdd062baf9877594a52e9a67e1ae25d9aeb57ff3673f21fa7c6a18407b WatchSource:0}: Error finding container 88a9dfcdd062baf9877594a52e9a67e1ae25d9aeb57ff3673f21fa7c6a18407b: Status 404 returned error can't find the container with id 88a9dfcdd062baf9877594a52e9a67e1ae25d9aeb57ff3673f21fa7c6a18407b Dec 06 09:03:19 crc kubenswrapper[4954]: I1206 09:03:19.824429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6699d8f6-3aa0-4564-ac22-cfbfade1f563","Type":"ContainerStarted","Data":"88a9dfcdd062baf9877594a52e9a67e1ae25d9aeb57ff3673f21fa7c6a18407b"} Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.797549 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.850758 4954 generic.go:334] "Generic (PLEG): container finished" podID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerID="2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76" exitCode=0 Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.850828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerDied","Data":"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76"} Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.850858 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"036caab1-0808-4d27-aa67-1ae88cb33df4","Type":"ContainerDied","Data":"7ea4015718f214d9d1e809e454b83a8c56e6637f3ed16f1d68e3e180f5051f7d"} Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.850878 4954 scope.go:117] "RemoveContainer" containerID="2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.851015 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.858250 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6699d8f6-3aa0-4564-ac22-cfbfade1f563","Type":"ContainerStarted","Data":"b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758"} Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.878021 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data\") pod \"036caab1-0808-4d27-aa67-1ae88cb33df4\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.878090 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rff8r\" (UniqueName: \"kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r\") pod \"036caab1-0808-4d27-aa67-1ae88cb33df4\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.878156 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle\") pod \"036caab1-0808-4d27-aa67-1ae88cb33df4\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.878859 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs\") pod \"036caab1-0808-4d27-aa67-1ae88cb33df4\" (UID: \"036caab1-0808-4d27-aa67-1ae88cb33df4\") " Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.880578 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs" (OuterVolumeSpecName: "logs") pod "036caab1-0808-4d27-aa67-1ae88cb33df4" (UID: "036caab1-0808-4d27-aa67-1ae88cb33df4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.884048 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r" (OuterVolumeSpecName: "kube-api-access-rff8r") pod "036caab1-0808-4d27-aa67-1ae88cb33df4" (UID: "036caab1-0808-4d27-aa67-1ae88cb33df4"). InnerVolumeSpecName "kube-api-access-rff8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.886109 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.886093382 podStartE2EDuration="2.886093382s" podCreationTimestamp="2025-12-06 09:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:03:20.875244662 +0000 UTC m=+7575.688604071" watchObservedRunningTime="2025-12-06 09:03:20.886093382 +0000 UTC m=+7575.699452771" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.901642 4954 scope.go:117] "RemoveContainer" containerID="c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.922678 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data" (OuterVolumeSpecName: "config-data") pod "036caab1-0808-4d27-aa67-1ae88cb33df4" (UID: "036caab1-0808-4d27-aa67-1ae88cb33df4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.922728 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "036caab1-0808-4d27-aa67-1ae88cb33df4" (UID: "036caab1-0808-4d27-aa67-1ae88cb33df4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.965078 4954 scope.go:117] "RemoveContainer" containerID="2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76" Dec 06 09:03:20 crc kubenswrapper[4954]: E1206 09:03:20.965516 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76\": container with ID starting with 2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76 not found: ID does not exist" containerID="2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.965549 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76"} err="failed to get container status \"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76\": rpc error: code = NotFound desc = could not find container \"2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76\": container with ID starting with 2c1ac7e426318d477c94eebd6aebc7ba871f3e2e48015050c2ee9f28476c9a76 not found: ID does not exist" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.965617 4954 scope.go:117] "RemoveContainer" containerID="c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc" Dec 06 09:03:20 crc kubenswrapper[4954]: E1206 09:03:20.965843 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc\": container with ID starting with c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc not found: ID does not exist" containerID="c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc" Dec 
06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.965867 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc"} err="failed to get container status \"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc\": rpc error: code = NotFound desc = could not find container \"c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc\": container with ID starting with c9a9a33e1760ab152f03b911963e5b3aa28615a9b12ad95e57cc051a87b51bfc not found: ID does not exist" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.981605 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.981642 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rff8r\" (UniqueName: \"kubernetes.io/projected/036caab1-0808-4d27-aa67-1ae88cb33df4-kube-api-access-rff8r\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.981656 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036caab1-0808-4d27-aa67-1ae88cb33df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:20 crc kubenswrapper[4954]: I1206 09:03:20.981668 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/036caab1-0808-4d27-aa67-1ae88cb33df4-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.211759 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.223941 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.264194 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:21 crc kubenswrapper[4954]: E1206 09:03:21.264739 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-api" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.264762 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-api" Dec 06 09:03:21 crc kubenswrapper[4954]: E1206 09:03:21.264819 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-log" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.264827 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-log" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.265052 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-api" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.265091 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" containerName="nova-api-log" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.266542 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.270754 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.291362 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.393256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.393470 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8s6c\" (UniqueName: \"kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.393524 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.393772 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.440194 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.440244 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.461602 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036caab1-0808-4d27-aa67-1ae88cb33df4" path="/var/lib/kubelet/pods/036caab1-0808-4d27-aa67-1ae88cb33df4/volumes" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.495430 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.495628 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8s6c\" (UniqueName: \"kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.495683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.495746 
4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.496248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.500100 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.500719 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.533356 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8s6c\" (UniqueName: \"kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c\") pod \"nova-api-0\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " pod="openstack/nova-api-0" Dec 06 09:03:21 crc kubenswrapper[4954]: I1206 09:03:21.592243 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.105400 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:22 crc kubenswrapper[4954]: W1206 09:03:22.110277 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777d31d7_fa22_4f20_83ff_fa9e33419a4b.slice/crio-e323d6b41958a47ded23b1936aa4aa3e906c81813f608fd6aa19e8e6a2095a8f WatchSource:0}: Error finding container e323d6b41958a47ded23b1936aa4aa3e906c81813f608fd6aa19e8e6a2095a8f: Status 404 returned error can't find the container with id e323d6b41958a47ded23b1936aa4aa3e906c81813f608fd6aa19e8e6a2095a8f Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.458821 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.458851 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.894583 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerStarted","Data":"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4"} 
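[editor's note, not part of the journal] The entries above trace the kubelet's delete-and-recreate cycle, first for openstack/nova-scheduler-0 and then for openstack/nova-api-0: UnmountVolume.TearDown and "Volume detached" for the old pod UID, SyncLoop DELETE/REMOVE/ADD, RemoveStaleState in the CPU and memory managers, VerifyControllerAttachedVolume and MountVolume.SetUp for the new UID, a fresh sandbox, and finally PLEG ContainerStarted events. Below is a minimal sketch of how such per-pod timelines could be reconstructed from a dump like this one. It is Python; the file name kubelet.log and the helper name pleg_timeline are assumptions for illustration, and entry boundaries are located by klog headers rather than physical lines, since the lines in this dump are collapsed.

    #!/usr/bin/env python3
    """Sketch: rebuild per-pod container timelines from a kubelet journal dump.

    Assumptions: the dump is saved as plain text in kubelet.log (hypothetical
    path), and PLEG events look exactly like the ones visible above.
    """
    import re
    from collections import defaultdict

    # klog header: severity letter, MMDD date, wall-clock time, then the PID.
    HEADER = re.compile(r'([IWE])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+) +\d+ ')
    # PLEG events as kubelet.go logs them: pod name plus an event payload.
    PLEG = re.compile(
        r'pod="(?P<pod>[^"]+)" event=\{"ID":"(?P<uid>[^"]+)",'
        r'"Type":"(?P<type>[^"]+)","Data":"(?P<data>[0-9a-f]+)"\}')

    def pleg_timeline(path="kubelet.log"):
        """Return {pod: [(time, event type, container ID prefix), ...]}."""
        timelines = defaultdict(list)
        text = open(path).read()
        heads = list(HEADER.finditer(text))
        # Each entry's message runs from its header to the next header.
        for head, nxt in zip(heads, heads[1:] + [None]):
            msg = text[head.end():nxt.start() if nxt else len(text)]
            event = PLEG.search(msg)
            if event:
                timelines[event["pod"]].append(
                    (head.group(3), event["type"], event["data"][:12]))
        return timelines

    if __name__ == "__main__":
        for pod, events in sorted(pleg_timeline().items()):
            print(pod)
            for ts, etype, cid in events:
                print(f"  {ts}  {etype:<16} {cid}")

Run against this excerpt it would show, for example, openstack/nova-api-0 with ContainerDied at 09:03:20.850828 followed by ContainerStarted at 09:03:22.894583; the two-second gap is the sandbox teardown and recreation visible in between.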
Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.894632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerStarted","Data":"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf"} Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.894642 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerStarted","Data":"e323d6b41958a47ded23b1936aa4aa3e906c81813f608fd6aa19e8e6a2095a8f"} Dec 06 09:03:22 crc kubenswrapper[4954]: I1206 09:03:22.915600 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.91554496 podStartE2EDuration="1.91554496s" podCreationTimestamp="2025-12-06 09:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:03:22.912636832 +0000 UTC m=+7577.725996221" watchObservedRunningTime="2025-12-06 09:03:22.91554496 +0000 UTC m=+7577.728904349" Dec 06 09:03:23 crc kubenswrapper[4954]: I1206 09:03:23.753226 4954 scope.go:117] "RemoveContainer" containerID="dbc8f66159d7cae384b467443a21ce3e67df52b39bad636b0e7d7fb3391acfcf" Dec 06 09:03:23 crc kubenswrapper[4954]: I1206 09:03:23.855542 4954 scope.go:117] "RemoveContainer" containerID="ec5af4d7d75513ef4eeebe851183690db923c6fa8c063cb21797c6a784499a9f" Dec 06 09:03:23 crc kubenswrapper[4954]: I1206 09:03:23.893277 4954 scope.go:117] "RemoveContainer" containerID="c0268e8e515dc62fa3730ba52504e551aa9923e39cabd7e224c50081a0f62485" Dec 06 09:03:23 crc kubenswrapper[4954]: I1206 09:03:23.931879 4954 scope.go:117] "RemoveContainer" containerID="5720dd372db598a4d9ed79a7faf225ca141e8786642b06a3540a3def26930713" Dec 06 09:03:24 crc kubenswrapper[4954]: I1206 09:03:24.190827 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:03:29 crc kubenswrapper[4954]: I1206 09:03:29.191518 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 09:03:29 crc kubenswrapper[4954]: I1206 09:03:29.221992 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 09:03:30 crc kubenswrapper[4954]: I1206 09:03:30.005626 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.457110 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.457485 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.462749 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.462986 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.592551 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:03:31 crc kubenswrapper[4954]: I1206 09:03:31.592611 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 
09:03:32 crc kubenswrapper[4954]: I1206 09:03:32.633806 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.109:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:32 crc kubenswrapper[4954]: I1206 09:03:32.633832 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.109:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:03:41 crc kubenswrapper[4954]: I1206 09:03:41.597382 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:03:41 crc kubenswrapper[4954]: I1206 09:03:41.598992 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:03:41 crc kubenswrapper[4954]: I1206 09:03:41.602526 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:03:41 crc kubenswrapper[4954]: I1206 09:03:41.602804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.105135 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.110026 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.283003 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.284365 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.322319 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.404445 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.404494 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.404522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.404543 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvsvq\" (UniqueName: \"kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.404661 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.506760 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.506806 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.506833 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvsvq\" (UniqueName: \"kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.506897 4954 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.507054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.507971 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.508475 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.509017 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.510535 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.529337 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvsvq\" (UniqueName: \"kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq\") pod \"dnsmasq-dns-6f5f7d8bd5-fgd7t\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:42 crc kubenswrapper[4954]: I1206 09:03:42.613916 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:43 crc kubenswrapper[4954]: I1206 09:03:43.102599 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:03:43 crc kubenswrapper[4954]: I1206 09:03:43.114210 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" event={"ID":"edc42899-5e6c-4354-92e3-d2a3dfa7707a","Type":"ContainerStarted","Data":"f75abca5c29f5b0f13657dab28f2992430824754a4d00126991da907ee89f698"} Dec 06 09:03:44 crc kubenswrapper[4954]: I1206 09:03:44.124176 4954 generic.go:334] "Generic (PLEG): container finished" podID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerID="abaa1d070c406cc0c6728a24a344c2f93be130b166c211357f87edea2eb1f511" exitCode=0 Dec 06 09:03:44 crc kubenswrapper[4954]: I1206 09:03:44.124373 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" event={"ID":"edc42899-5e6c-4354-92e3-d2a3dfa7707a","Type":"ContainerDied","Data":"abaa1d070c406cc0c6728a24a344c2f93be130b166c211357f87edea2eb1f511"} Dec 06 09:03:44 crc kubenswrapper[4954]: I1206 09:03:44.909502 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:45 crc kubenswrapper[4954]: I1206 09:03:45.135711 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" event={"ID":"edc42899-5e6c-4354-92e3-d2a3dfa7707a","Type":"ContainerStarted","Data":"d3c62812cc251143e4c8eea60ca6dbd6a5e40ba07d0ff37780eed9c691086b04"} Dec 06 09:03:45 crc kubenswrapper[4954]: I1206 09:03:45.135847 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-log" containerID="cri-o://61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf" gracePeriod=30 Dec 06 09:03:45 crc kubenswrapper[4954]: I1206 09:03:45.135976 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-api" containerID="cri-o://6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4" gracePeriod=30 Dec 06 09:03:45 crc kubenswrapper[4954]: I1206 09:03:45.136440 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:45 crc kubenswrapper[4954]: I1206 09:03:45.166763 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" podStartSLOduration=3.166742447 podStartE2EDuration="3.166742447s" podCreationTimestamp="2025-12-06 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:03:45.162831612 +0000 UTC m=+7599.976191001" watchObservedRunningTime="2025-12-06 09:03:45.166742447 +0000 UTC m=+7599.980101836" Dec 06 09:03:46 crc kubenswrapper[4954]: I1206 09:03:46.144449 4954 generic.go:334] "Generic (PLEG): container finished" podID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerID="61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf" exitCode=143 Dec 06 09:03:46 crc kubenswrapper[4954]: I1206 09:03:46.144504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerDied","Data":"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf"} Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.750042 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.862406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle\") pod \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.862474 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8s6c\" (UniqueName: \"kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c\") pod \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.862591 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data\") pod \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.862661 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs\") pod \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\" (UID: \"777d31d7-fa22-4f20-83ff-fa9e33419a4b\") " Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.863470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs" (OuterVolumeSpecName: "logs") pod "777d31d7-fa22-4f20-83ff-fa9e33419a4b" (UID: "777d31d7-fa22-4f20-83ff-fa9e33419a4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.869696 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c" (OuterVolumeSpecName: "kube-api-access-d8s6c") pod "777d31d7-fa22-4f20-83ff-fa9e33419a4b" (UID: "777d31d7-fa22-4f20-83ff-fa9e33419a4b"). InnerVolumeSpecName "kube-api-access-d8s6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.898817 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "777d31d7-fa22-4f20-83ff-fa9e33419a4b" (UID: "777d31d7-fa22-4f20-83ff-fa9e33419a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.913142 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data" (OuterVolumeSpecName: "config-data") pod "777d31d7-fa22-4f20-83ff-fa9e33419a4b" (UID: "777d31d7-fa22-4f20-83ff-fa9e33419a4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.964397 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.964429 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8s6c\" (UniqueName: \"kubernetes.io/projected/777d31d7-fa22-4f20-83ff-fa9e33419a4b-kube-api-access-d8s6c\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.964441 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777d31d7-fa22-4f20-83ff-fa9e33419a4b-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:48 crc kubenswrapper[4954]: I1206 09:03:48.964451 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777d31d7-fa22-4f20-83ff-fa9e33419a4b-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.170065 4954 generic.go:334] "Generic (PLEG): container finished" podID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerID="6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4" exitCode=0 Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.170117 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerDied","Data":"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4"} Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.170147 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"777d31d7-fa22-4f20-83ff-fa9e33419a4b","Type":"ContainerDied","Data":"e323d6b41958a47ded23b1936aa4aa3e906c81813f608fd6aa19e8e6a2095a8f"} Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.170144 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.170168 4954 scope.go:117] "RemoveContainer" containerID="6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.198983 4954 scope.go:117] "RemoveContainer" containerID="61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.210401 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.229145 4954 scope.go:117] "RemoveContainer" containerID="6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.229368 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:49 crc kubenswrapper[4954]: E1206 09:03:49.229964 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4\": container with ID starting with 6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4 not found: ID does not exist" containerID="6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.229992 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4"} err="failed to get container status \"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4\": rpc error: code = NotFound desc = could not find container \"6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4\": container with ID starting with 6e4b54ae1d7207ba795136e3026ec5e9646d96a93fc8ea13cb11435e5240d0f4 not found: ID does not exist" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.230014 4954 scope.go:117] "RemoveContainer" containerID="61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf" Dec 06 09:03:49 crc kubenswrapper[4954]: E1206 09:03:49.230865 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf\": container with ID starting with 61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf not found: ID does not exist" containerID="61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.230893 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf"} err="failed to get container status \"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf\": rpc error: code = NotFound desc = could not find container \"61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf\": container with ID starting with 61fcb4cc0126980ce238e55bbb767704fe8f3276ba92e8a10aa25f4bfae26acf not found: ID does not exist" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.242851 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:49 crc kubenswrapper[4954]: E1206 09:03:49.243256 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-api" Dec 06 09:03:49 crc 
kubenswrapper[4954]: I1206 09:03:49.243274 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-api" Dec 06 09:03:49 crc kubenswrapper[4954]: E1206 09:03:49.243291 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-log" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.243300 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-log" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.243460 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-api" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.243475 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" containerName="nova-api-log" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.244645 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.248737 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.249157 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.249426 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.256743 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.370767 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgbq\" (UniqueName: \"kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.370839 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.371069 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.371291 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.371529 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data\") pod 
\"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.371824 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.463308 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777d31d7-fa22-4f20-83ff-fa9e33419a4b" path="/var/lib/kubelet/pods/777d31d7-fa22-4f20-83ff-fa9e33419a4b/volumes" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgbq\" (UniqueName: \"kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474382 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474631 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474784 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.474935 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.475421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.480047 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.480465 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.481544 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.481638 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.494735 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgbq\" (UniqueName: \"kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq\") pod \"nova-api-0\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " pod="openstack/nova-api-0" Dec 06 09:03:49 crc kubenswrapper[4954]: I1206 09:03:49.580577 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:03:50 crc kubenswrapper[4954]: I1206 09:03:50.103327 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:03:50 crc kubenswrapper[4954]: I1206 09:03:50.179488 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerStarted","Data":"6b23254c3a0a68e9730786a752e9ba490964144c5de92c690c832a19edc70b84"} Dec 06 09:03:51 crc kubenswrapper[4954]: I1206 09:03:51.189859 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerStarted","Data":"d71644552accf5ae017efcff05ca3deea024bd07c42a94f8b77215c7f8d0956b"} Dec 06 09:03:51 crc kubenswrapper[4954]: I1206 09:03:51.190125 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerStarted","Data":"4bc6e5b0e964b77c627b986b87b8adee2fc9226f47b5cb20140ddbd607f93902"} Dec 06 09:03:51 crc kubenswrapper[4954]: I1206 09:03:51.210937 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.210918486 podStartE2EDuration="2.210918486s" podCreationTimestamp="2025-12-06 09:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:03:51.209307813 +0000 UTC m=+7606.022667242" watchObservedRunningTime="2025-12-06 09:03:51.210918486 +0000 UTC m=+7606.024277875" Dec 06 09:03:52 crc kubenswrapper[4954]: I1206 09:03:52.614897 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:03:52 crc kubenswrapper[4954]: I1206 09:03:52.707126 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"] Dec 06 09:03:52 crc kubenswrapper[4954]: I1206 09:03:52.707378 4954 kuberuntime_container.go:808] "Killing container 
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.214438 4954 generic.go:334] "Generic (PLEG): container finished" podID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerID="073a19617c17941e084c57a92ccacf88aede8bc77f16ff842329f11ea49766e3" exitCode=0
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.214503 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" event={"ID":"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a","Type":"ContainerDied","Data":"073a19617c17941e084c57a92ccacf88aede8bc77f16ff842329f11ea49766e3"}
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.214743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff" event={"ID":"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a","Type":"ContainerDied","Data":"ebc8dbe722682110dcec9feef55e8f0fbd8d9cd9f86d68885c25c641a316907c"}
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.214756 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebc8dbe722682110dcec9feef55e8f0fbd8d9cd9f86d68885c25c641a316907c"
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.252714 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff"
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.361122 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb\") pod \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") "
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.361274 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc\") pod \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") "
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.361312 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lps2f\" (UniqueName: \"kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f\") pod \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") "
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.361399 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config\") pod \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") "
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.361477 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb\") pod \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\" (UID: \"2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a\") "
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.366940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f" (OuterVolumeSpecName: "kube-api-access-lps2f") pod "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" (UID: "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a"). InnerVolumeSpecName "kube-api-access-lps2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.420962 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" (UID: "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.427720 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" (UID: "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.429917 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" (UID: "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.442608 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config" (OuterVolumeSpecName: "config") pod "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" (UID: "2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.464168 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.464201 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.464243 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.464257 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lps2f\" (UniqueName: \"kubernetes.io/projected/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-kube-api-access-lps2f\") on node \"crc\" DevicePath \"\""
Dec 06 09:03:53 crc kubenswrapper[4954]: I1206 09:03:53.464270 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a-config\") on node \"crc\" DevicePath \"\""
Dec 06 09:03:54 crc kubenswrapper[4954]: I1206 09:03:54.222963 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f85bbdcf9-tk6ff"
Dec 06 09:03:54 crc kubenswrapper[4954]: I1206 09:03:54.249948 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"]
Dec 06 09:03:54 crc kubenswrapper[4954]: I1206 09:03:54.259546 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f85bbdcf9-tk6ff"]
Dec 06 09:03:55 crc kubenswrapper[4954]: I1206 09:03:55.455319 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" path="/var/lib/kubelet/pods/2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a/volumes"
Dec 06 09:03:59 crc kubenswrapper[4954]: I1206 09:03:59.581232 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 06 09:03:59 crc kubenswrapper[4954]: I1206 09:03:59.582857 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 06 09:04:00 crc kubenswrapper[4954]: I1206 09:04:00.594778 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.111:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 06 09:04:00 crc kubenswrapper[4954]: I1206 09:04:00.594795 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.111:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 06 09:04:09 crc kubenswrapper[4954]: I1206 09:04:09.590264 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 06 09:04:09 crc kubenswrapper[4954]: I1206 09:04:09.591052 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 06 09:04:09 crc kubenswrapper[4954]: I1206 09:04:09.592825 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 06 09:04:09 crc kubenswrapper[4954]: I1206 09:04:09.600888 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 06 09:04:10 crc kubenswrapper[4954]: I1206 09:04:10.101637 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:04:10 crc kubenswrapper[4954]: I1206 09:04:10.101739 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:04:10 crc kubenswrapper[4954]: I1206 09:04:10.368196 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 06 09:04:10 crc kubenswrapper[4954]: I1206 09:04:10.381050 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.504583 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"]
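The nova-api-0 startup-probe failures above carry the telltale Go error "net/http: request canceled (Client.Timeout exceeded while awaiting headers)": the backend accepted the connection but did not return headers within the probe timeout, and ten seconds later the probe flips to started, so the API was simply still warming up. A self-contained sketch of that failure mode, not kubelet's prober code; the 2-second delay stands in for an API server that is still initializing, and the 1-second client timeout mirrors a typical probe timeoutSeconds:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// A backend that is slow to send response headers.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
	}))
	defer slow.Close()

	probe := &http.Client{Timeout: 1 * time.Second}
	_, err := probe.Get(slow.URL)
	// Prints: ... (Client.Timeout exceeded while awaiting headers)
	fmt.Println(err)
}
```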
pods=["openstack/horizon-65b5bbcb89-skdjp"] Dec 06 09:04:24 crc kubenswrapper[4954]: E1206 09:04:24.505595 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerName="init" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.505613 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerName="init" Dec 06 09:04:24 crc kubenswrapper[4954]: E1206 09:04:24.505658 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerName="dnsmasq-dns" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.505666 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerName="dnsmasq-dns" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.505892 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2c2cf5-9333-4702-bee1-b0f7aeb54e2a" containerName="dnsmasq-dns" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.507111 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.513367 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.513922 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.514471 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.519637 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-kjfhj" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.533753 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"] Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.572234 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.572353 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.572447 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.572556 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6w6d\" (UniqueName: \"kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " 
pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.572728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.583517 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.583909 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-log" containerID="cri-o://ab537d70b86dbdeee1673a5e85e881972b23f51ef78d8ab71bc18429ee8fe81b" gracePeriod=30 Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.586954 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-httpd" containerID="cri-o://fd79a5aa15258c69a1e9270ca7c72365c816bc413d9201e4eecfb57701183d44" gracePeriod=30 Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.674539 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.674667 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.674743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.674765 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.674790 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6w6d\" (UniqueName: \"kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.676327 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " 
pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.676392 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"] Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.676754 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.677517 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.679149 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.683279 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.691529 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.691963 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-log" containerID="cri-o://500f2fc8df3f1ece73cec8ecc49047b0de1241d9f0ee0300d4f8e7cafe8c1b6c" gracePeriod=30 Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.692026 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-httpd" containerID="cri-o://fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa" gracePeriod=30 Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.711820 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6w6d\" (UniqueName: \"kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d\") pod \"horizon-65b5bbcb89-skdjp\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") " pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.729221 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"] Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.777078 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.777173 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data\") pod 
\"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.777218 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.777338 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfckq\" (UniqueName: \"kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.778113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.840166 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.881487 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.881673 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.881722 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.881749 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.881836 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfckq\" (UniqueName: \"kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.882212 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.883555 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.883865 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.888786 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:24 crc kubenswrapper[4954]: I1206 09:04:24.906695 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfckq\" (UniqueName: \"kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq\") pod \"horizon-7f76f758d9-hnvjl\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") " pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.093347 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.281037 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"] Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.286554 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.520831 4954 generic.go:334] "Generic (PLEG): container finished" podID="f5005a42-2172-4a4d-b063-a9c24649e747" containerID="500f2fc8df3f1ece73cec8ecc49047b0de1241d9f0ee0300d4f8e7cafe8c1b6c" exitCode=143 Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.520914 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerDied","Data":"500f2fc8df3f1ece73cec8ecc49047b0de1241d9f0ee0300d4f8e7cafe8c1b6c"} Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.524226 4954 generic.go:334] "Generic (PLEG): container finished" podID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerID="ab537d70b86dbdeee1673a5e85e881972b23f51ef78d8ab71bc18429ee8fe81b" exitCode=143 Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.524303 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerDied","Data":"ab537d70b86dbdeee1673a5e85e881972b23f51ef78d8ab71bc18429ee8fe81b"} Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.526205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerStarted","Data":"2d774b4cec7a34218cc09493a4e8e544ef069cf7d25adedc9dec36fdad4cae4c"} Dec 06 09:04:25 crc kubenswrapper[4954]: I1206 09:04:25.664133 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"] Dec 06 09:04:25 crc kubenswrapper[4954]: W1206 09:04:25.668072 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae2b9b5e_4640_4458_8335_e7afa9a587cc.slice/crio-dd1534e6438fb1754675093192b14b553dc13d9f303be6af65f630df4d760612 WatchSource:0}: Error finding container dd1534e6438fb1754675093192b14b553dc13d9f303be6af65f630df4d760612: Status 404 returned error can't find the container with id dd1534e6438fb1754675093192b14b553dc13d9f303be6af65f630df4d760612 Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.544720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerStarted","Data":"dd1534e6438fb1754675093192b14b553dc13d9f303be6af65f630df4d760612"} Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.614097 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"] Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.680899 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"] Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.682891 4954 util.go:30] "No sandbox for pod can be found. 
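The glance-log containers killed with gracePeriod=30 above finish with exitCode=143, which is 128 + 15 (SIGTERM): they died from the polite signal inside the grace window, while the glance-httpd siblings later exit 0 after draining. A minimal sketch of that TERM-then-KILL contract, not CRI-O's implementation (Unix-only; `sleep` stands in for a container process):

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	cmd.Process.Signal(syscall.SIGTERM) // "Killing container with a grace period"
	select {
	case <-done: // exited within the grace period, as glance-log did
	case <-time.After(10 * time.Second): // grace period elapsed
		cmd.Process.Kill() // SIGKILL; would surface as exitCode=137 instead
		<-done
	}
	// Shells and runtimes report 128 + signal number for signal deaths.
	ws := cmd.ProcessState.Sys().(syscall.WaitStatus)
	fmt.Println(128 + int(ws.Signal())) // 143, matching the PLEG lines above
}
```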
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.691058 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.697575 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"]
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.736716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5lsf\" (UniqueName: \"kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.736774 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.736836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.736919 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.736944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.737060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.737178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.756819 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"]
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.802324 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"]
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.804117 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.812199 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"]
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.842847 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5lsf\" (UniqueName: \"kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.842891 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.842928 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.842965 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2s5f\" (UniqueName: \"kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.842993 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843013 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843084 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843116 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843157 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.843222 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.844808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.846268 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.846421 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.852717 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.852997 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.863611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.869501 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5lsf\" (UniqueName: \"kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf\") pod \"horizon-5b7c4f984d-gkz9f\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.945677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2s5f\" (UniqueName: \"kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.946075 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.946696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.947733 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.946730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.948018 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.948166 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.948187 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.948484 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.949084 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.949589 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.950945 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.953887 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:26 crc kubenswrapper[4954]: I1206 09:04:26.963690 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2s5f\" (UniqueName: \"kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f\") pod \"horizon-7d65f959c6-krcmt\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:27 crc kubenswrapper[4954]: I1206 09:04:27.014911 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:04:27 crc kubenswrapper[4954]: I1206 09:04:27.127163 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:27 crc kubenswrapper[4954]: W1206 09:04:27.575963 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd5a5191_b7b3_418b_8c8f_034808addca0.slice/crio-1859a518ee07f51c401f88cfdc0e8c86a72527d45bbf4be7d53a10986f082e8a WatchSource:0}: Error finding container 1859a518ee07f51c401f88cfdc0e8c86a72527d45bbf4be7d53a10986f082e8a: Status 404 returned error can't find the container with id 1859a518ee07f51c401f88cfdc0e8c86a72527d45bbf4be7d53a10986f082e8a
Dec 06 09:04:27 crc kubenswrapper[4954]: I1206 09:04:27.580784 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"]
Dec 06 09:04:27 crc kubenswrapper[4954]: I1206 09:04:27.693651 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"]
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.573399 4954 generic.go:334] "Generic (PLEG): container finished" podID="f5005a42-2172-4a4d-b063-a9c24649e747" containerID="fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa" exitCode=0
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.573466 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerDied","Data":"fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa"}
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.576993 4954 generic.go:334] "Generic (PLEG): container finished" podID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerID="fd79a5aa15258c69a1e9270ca7c72365c816bc413d9201e4eecfb57701183d44" exitCode=0
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.577064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerDied","Data":"fd79a5aa15258c69a1e9270ca7c72365c816bc413d9201e4eecfb57701183d44"}
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.578838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerStarted","Data":"7f281bef00f061c55fcd320df1a97237e14d68b9c79dbeadc3e247459cd9a142"}
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.580264 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerStarted","Data":"1859a518ee07f51c401f88cfdc0e8c86a72527d45bbf4be7d53a10986f082e8a"}
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.831289 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
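The "SyncLoop (PLEG): event for pod" lines above carry a JSON payload after "event=", so a pod's container lifecycle can be reconstructed mechanically from a journal dump like this one. A small sketch over two lines copied from this log; the struct field names are taken from the payload keys, everything else is assumption:

```go
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
)

// Matches the keys in the event={...} payload logged by the kubelet above.
type plegEvent struct {
	ID, Type, Data string
}

func main() {
	lines := []string{
		`... "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerDied","Data":"fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa"}`,
		`... "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerStarted","Data":"7f281bef00f061c55fcd320df1a97237e14d68b9c79dbeadc3e247459cd9a142"}`,
	}
	re := regexp.MustCompile(`event=(\{.*\})`)
	for _, l := range lines {
		m := re.FindStringSubmatch(l)
		if m == nil {
			continue
		}
		var ev plegEvent
		if err := json.Unmarshal([]byte(m[1]), &ev); err != nil {
			continue
		}
		fmt.Printf("%s podUID=%s container=%.12s\n", ev.Type, ev.ID, ev.Data)
	}
}
```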
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894417 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjhfh\" (UniqueName: \"kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894840 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894862 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894909 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894947 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.894977 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.895012 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle\") pod \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\" (UID: \"bccf6ec3-7db7-497d-97ba-e7002fb77b80\") "
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.901172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs" (OuterVolumeSpecName: "logs") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.903780 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh" (OuterVolumeSpecName: "kube-api-access-vjhfh") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "kube-api-access-vjhfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.908032 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.918277 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts" (OuterVolumeSpecName: "scripts") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.969772 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:04:28 crc kubenswrapper[4954]: I1206 09:04:28.996061 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.000350 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.000554 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccf6ec3-7db7-497d-97ba-e7002fb77b80-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.000825 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.000975 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.001140 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.001415 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjhfh\" (UniqueName: \"kubernetes.io/projected/bccf6ec3-7db7-497d-97ba-e7002fb77b80-kube-api-access-vjhfh\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.007491 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data" (OuterVolumeSpecName: "config-data") pod "bccf6ec3-7db7-497d-97ba-e7002fb77b80" (UID: "bccf6ec3-7db7-497d-97ba-e7002fb77b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.104757 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccf6ec3-7db7-497d-97ba-e7002fb77b80-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.594669 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bccf6ec3-7db7-497d-97ba-e7002fb77b80","Type":"ContainerDied","Data":"118ad42982ce60c01e606e5170ba5994ec6bbe62f9be498fd4a922b07ecd5417"}
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.594731 4954 scope.go:117] "RemoveContainer" containerID="fd79a5aa15258c69a1e9270ca7c72365c816bc413d9201e4eecfb57701183d44"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.594844 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.620535 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.628647 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.652981 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 09:04:29 crc kubenswrapper[4954]: E1206 09:04:29.653457 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-httpd"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.653478 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-httpd"
Dec 06 09:04:29 crc kubenswrapper[4954]: E1206 09:04:29.653515 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-log"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.653523 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-log"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.653758 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-log"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.653787 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" containerName="glance-httpd"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.702994 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.703107 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.707203 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.707520 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820099 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgfd\" (UniqueName: \"kubernetes.io/projected/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-kube-api-access-vzgfd\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820154 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-logs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820202 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820378 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.820398 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.821626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.924499 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgfd\" (UniqueName: \"kubernetes.io/projected/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-kube-api-access-vzgfd\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.924586 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.924616 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-logs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.924647 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.925046 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.925069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.925143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.925290 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-logs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.926610 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.932257 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0"
Dec
06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.937368 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0" Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.938259 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0" Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.949663 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgfd\" (UniqueName: \"kubernetes.io/projected/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-kube-api-access-vzgfd\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0" Dec 06 09:04:29 crc kubenswrapper[4954]: I1206 09:04:29.962288 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0be3b2df-5fe0-44f7-aa96-6c714b5b96b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3\") " pod="openstack/glance-default-external-api-0" Dec 06 09:04:30 crc kubenswrapper[4954]: I1206 09:04:30.027227 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 09:04:31 crc kubenswrapper[4954]: I1206 09:04:31.456845 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccf6ec3-7db7-497d-97ba-e7002fb77b80" path="/var/lib/kubelet/pods/bccf6ec3-7db7-497d-97ba-e7002fb77b80/volumes" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.074476 4954 scope.go:117] "RemoveContainer" containerID="ab537d70b86dbdeee1673a5e85e881972b23f51ef78d8ab71bc18429ee8fe81b" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.225356 4954 util.go:48] "No ready sandbox for pod can be found. 
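Between the ADD and the first sandbox start, the volume manager walks each declared volume through the same three steps visible above: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). A toy desired-state/actual-state loop in that spirit, using the volume names from this pod; this is an illustrative sketch, not the kubelet's real data structures:

```go
package main

import "fmt"

func main() {
	// Volumes the pod spec declares (desired state), as in the log above.
	desired := []string{"kube-api-access-vzgfd", "scripts", "logs",
		"public-tls-certs", "combined-ca-bundle", "httpd-run", "config-data"}
	actual := map[string]bool{} // what is currently mounted (actual state)
	// Reconcile: mount anything desired that is not yet actual.
	for _, v := range desired {
		if !actual[v] {
			fmt.Printf("MountVolume started for volume %q\n", v)
			actual[v] = true // stand-in for the real SetUp call
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
		}
	}
}
```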
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355065 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdcg\" (UniqueName: \"kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355462 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355594 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355610 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.355854 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.356249 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.356545 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts\") pod \"f5005a42-2172-4a4d-b063-a9c24649e747\" (UID: \"f5005a42-2172-4a4d-b063-a9c24649e747\") " Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.357094 4954 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.357608 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs" (OuterVolumeSpecName: "logs") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.361483 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg" (OuterVolumeSpecName: "kube-api-access-xwdcg") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "kube-api-access-xwdcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.361817 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts" (OuterVolumeSpecName: "scripts") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.458973 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5005a42-2172-4a4d-b063-a9c24649e747-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.459000 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.459012 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdcg\" (UniqueName: \"kubernetes.io/projected/f5005a42-2172-4a4d-b063-a9c24649e747-kube-api-access-xwdcg\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.550141 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.560921 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.606687 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data" (OuterVolumeSpecName: "config-data") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.611770 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f5005a42-2172-4a4d-b063-a9c24649e747" (UID: "f5005a42-2172-4a4d-b063-a9c24649e747"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.643720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerStarted","Data":"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca"} Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.648154 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerStarted","Data":"6111d82da607d255c25b89312b5b098b5efc229dee3dcb54ddb20ff558778520"} Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.649748 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerStarted","Data":"71a0c6290e1fb04caecd32e1a8466f1476c5b203662fd2f8a3c7a4f0fd6ffccf"} Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.655104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerStarted","Data":"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff"} Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.658778 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f5005a42-2172-4a4d-b063-a9c24649e747","Type":"ContainerDied","Data":"6d759c460d85b42748485b75443cc67bdcbc95a2fe4363973ba119ba02a7d17c"} Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.658833 4954 scope.go:117] "RemoveContainer" containerID="fdc387b7a76029405379b37127fe7d9ec4b0befd24f8eb115bb83973bf3671fa" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.658961 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.665783 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.665813 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5005a42-2172-4a4d-b063-a9c24649e747-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.700931 4954 scope.go:117] "RemoveContainer" containerID="500f2fc8df3f1ece73cec8ecc49047b0de1241d9f0ee0300d4f8e7cafe8c1b6c" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.739931 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.752749 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.766947 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:34 crc kubenswrapper[4954]: E1206 09:04:34.768760 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-httpd" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.768783 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-httpd" Dec 06 09:04:34 crc kubenswrapper[4954]: E1206 09:04:34.768806 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-log" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.768812 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-log" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.768988 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-httpd" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.769007 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-log" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.770041 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.772733 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.773014 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.797075 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.856308 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 09:04:34 crc kubenswrapper[4954]: W1206 09:04:34.871871 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be3b2df_5fe0_44f7_aa96_6c714b5b96b3.slice/crio-276d3c14deee7faf5dfe1309bb7cf11048736bc909eeb53870522fa18e3a985c WatchSource:0}: Error finding container 276d3c14deee7faf5dfe1309bb7cf11048736bc909eeb53870522fa18e3a985c: Status 404 returned error can't find the container with id 276d3c14deee7faf5dfe1309bb7cf11048736bc909eeb53870522fa18e3a985c Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.875712 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.875768 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.875796 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vm56\" (UniqueName: \"kubernetes.io/projected/d9a2d5ad-ab05-499d-afa3-c52316bb2502-kube-api-access-5vm56\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.876094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.876131 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.876188 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.876259 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.978612 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979146 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979319 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979349 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979377 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vm56\" (UniqueName: \"kubernetes.io/projected/d9a2d5ad-ab05-499d-afa3-c52316bb2502-kube-api-access-5vm56\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.979603 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:34 crc kubenswrapper[4954]: I1206 09:04:34.980804 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9a2d5ad-ab05-499d-afa3-c52316bb2502-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.015114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.015806 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.016896 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.019155 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a2d5ad-ab05-499d-afa3-c52316bb2502-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.032364 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vm56\" (UniqueName: \"kubernetes.io/projected/d9a2d5ad-ab05-499d-afa3-c52316bb2502-kube-api-access-5vm56\") pod \"glance-default-internal-api-0\" (UID: \"d9a2d5ad-ab05-499d-afa3-c52316bb2502\") " pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.126183 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.497027 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" path="/var/lib/kubelet/pods/f5005a42-2172-4a4d-b063-a9c24649e747/volumes" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.726862 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerStarted","Data":"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a"} Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.773279 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b7c4f984d-gkz9f" podStartSLOduration=3.172728137 podStartE2EDuration="9.773256199s" podCreationTimestamp="2025-12-06 09:04:26 +0000 UTC" firstStartedPulling="2025-12-06 09:04:27.579704172 +0000 UTC m=+7642.393063571" lastFinishedPulling="2025-12-06 09:04:34.180232244 +0000 UTC m=+7648.993591633" observedRunningTime="2025-12-06 09:04:35.769669543 +0000 UTC m=+7650.583028942" watchObservedRunningTime="2025-12-06 09:04:35.773256199 +0000 UTC m=+7650.586615588" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.775669 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerStarted","Data":"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2"} Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.793301 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3","Type":"ContainerStarted","Data":"276d3c14deee7faf5dfe1309bb7cf11048736bc909eeb53870522fa18e3a985c"} Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.816190 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65b5bbcb89-skdjp" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon-log" containerID="cri-o://6111d82da607d255c25b89312b5b098b5efc229dee3dcb54ddb20ff558778520" gracePeriod=30 Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.816685 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerStarted","Data":"50410ebca16e2200383c361a425278479d427a64e32e8c2c02ea0931bc60aa62"} Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.816964 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65b5bbcb89-skdjp" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon" containerID="cri-o://50410ebca16e2200383c361a425278479d427a64e32e8c2c02ea0931bc60aa62" gracePeriod=30 Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.820488 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d65f959c6-krcmt" podStartSLOduration=3.34492823 podStartE2EDuration="9.820471384s" podCreationTimestamp="2025-12-06 09:04:26 +0000 UTC" firstStartedPulling="2025-12-06 09:04:27.705005058 +0000 UTC m=+7642.518364447" lastFinishedPulling="2025-12-06 09:04:34.180548212 +0000 UTC m=+7648.993907601" observedRunningTime="2025-12-06 09:04:35.809546591 +0000 UTC m=+7650.622905970" watchObservedRunningTime="2025-12-06 09:04:35.820471384 +0000 UTC m=+7650.633830773" Dec 06 09:04:35 crc 
kubenswrapper[4954]: I1206 09:04:35.821344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerStarted","Data":"0edda40b0e2fbc331865e05d19e4e1f0fd7d5ab96e83325dd917ec0fb47121f6"} Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.821494 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f76f758d9-hnvjl" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon-log" containerID="cri-o://71a0c6290e1fb04caecd32e1a8466f1476c5b203662fd2f8a3c7a4f0fd6ffccf" gracePeriod=30 Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.821649 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f76f758d9-hnvjl" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon" containerID="cri-o://0edda40b0e2fbc331865e05d19e4e1f0fd7d5ab96e83325dd917ec0fb47121f6" gracePeriod=30 Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.847285 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-65b5bbcb89-skdjp" podStartSLOduration=2.9542447149999997 podStartE2EDuration="11.847244991s" podCreationTimestamp="2025-12-06 09:04:24 +0000 UTC" firstStartedPulling="2025-12-06 09:04:25.286255792 +0000 UTC m=+7640.099615181" lastFinishedPulling="2025-12-06 09:04:34.179256058 +0000 UTC m=+7648.992615457" observedRunningTime="2025-12-06 09:04:35.842667208 +0000 UTC m=+7650.656026597" watchObservedRunningTime="2025-12-06 09:04:35.847244991 +0000 UTC m=+7650.660604380" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.885486 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f76f758d9-hnvjl" podStartSLOduration=3.344939802 podStartE2EDuration="11.885462345s" podCreationTimestamp="2025-12-06 09:04:24 +0000 UTC" firstStartedPulling="2025-12-06 09:04:25.673259219 +0000 UTC m=+7640.486618608" lastFinishedPulling="2025-12-06 09:04:34.213781762 +0000 UTC m=+7649.027141151" observedRunningTime="2025-12-06 09:04:35.870138294 +0000 UTC m=+7650.683497673" watchObservedRunningTime="2025-12-06 09:04:35.885462345 +0000 UTC m=+7650.698821734" Dec 06 09:04:35 crc kubenswrapper[4954]: I1206 09:04:35.895994 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 09:04:36 crc kubenswrapper[4954]: I1206 09:04:36.859173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2d5ad-ab05-499d-afa3-c52316bb2502","Type":"ContainerStarted","Data":"9761a138ebcf648628fa4644c2c2c3e174cd5e1fc0a7d29b7ed1cf788861b21b"} Dec 06 09:04:36 crc kubenswrapper[4954]: I1206 09:04:36.859778 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2d5ad-ab05-499d-afa3-c52316bb2502","Type":"ContainerStarted","Data":"5f6ea7bb6cc96a095f7dcf43ef42bbbdceedff71ac63723edcf9ed55544777a5"} Dec 06 09:04:36 crc kubenswrapper[4954]: I1206 09:04:36.888368 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3","Type":"ContainerStarted","Data":"f0d7f2ec7300c44b215822c2312aa700a2c20d7b3040be7d1b46a82b3d07fd9e"} Dec 06 09:04:36 crc kubenswrapper[4954]: I1206 09:04:36.888487 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
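The "Observed pod startup duration" lines above relate their two figures arithmetically: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that. For horizon-5b7c4f984d-gkz9f: 9.773s end to end, a pull window of about 6.60s, hence the reported ~3.173s (the kubelet subtracts the monotonic m=+… readings, so the last digits differ slightly from plain wall-clock subtraction). A small Go check using the timestamps copied from that line:

```go
package main

import (
	"fmt"
	"time"
)

// mustParse parses the default Go time format the kubelet logs use.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps from the horizon-5b7c4f984d-gkz9f line above.
	created := mustParse("2025-12-06 09:04:26 +0000 UTC")
	observed := mustParse("2025-12-06 09:04:35.773256199 +0000 UTC")
	pullStart := mustParse("2025-12-06 09:04:27.579704172 +0000 UTC")
	pullEnd := mustParse("2025-12-06 09:04:34.180232244 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - pullEnd.Sub(pullStart) // E2E minus the image-pull window
	fmt.Println(e2e, slo)               // 9.773256199s ~3.172728s
}
```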
event={"ID":"0be3b2df-5fe0-44f7-aa96-6c714b5b96b3","Type":"ContainerStarted","Data":"18d252a9025ade03c6dab0197028f18c60a4e69cf0649b03f4fdd05203c23a99"} Dec 06 09:04:36 crc kubenswrapper[4954]: I1206 09:04:36.919658 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.919632589 podStartE2EDuration="7.919632589s" podCreationTimestamp="2025-12-06 09:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:04:36.906984301 +0000 UTC m=+7651.720343690" watchObservedRunningTime="2025-12-06 09:04:36.919632589 +0000 UTC m=+7651.732991978" Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.015592 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7c4f984d-gkz9f" Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.015713 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7c4f984d-gkz9f" Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.127358 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d65f959c6-krcmt" Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.127674 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d65f959c6-krcmt" Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.900895 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9a2d5ad-ab05-499d-afa3-c52316bb2502","Type":"ContainerStarted","Data":"1594115bbc0ab25f3db9b62030f05e132fdd7feea4c1e90455ad8b7ab73a7366"} Dec 06 09:04:37 crc kubenswrapper[4954]: I1206 09:04:37.927112 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.927084248 podStartE2EDuration="3.927084248s" podCreationTimestamp="2025-12-06 09:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:04:37.919194797 +0000 UTC m=+7652.732554196" watchObservedRunningTime="2025-12-06 09:04:37.927084248 +0000 UTC m=+7652.740443637" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.028353 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.028898 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.070350 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.075117 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.101737 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.101813 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.928507 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 09:04:40 crc kubenswrapper[4954]: I1206 09:04:40.928576 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 09:04:44 crc kubenswrapper[4954]: I1206 09:04:44.185348 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 09:04:44 crc kubenswrapper[4954]: I1206 09:04:44.296453 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 09:04:44 crc kubenswrapper[4954]: I1206 09:04:44.840735 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65b5bbcb89-skdjp" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.093794 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f76f758d9-hnvjl" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.127468 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.127519 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.162546 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.173094 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.988475 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:45 crc kubenswrapper[4954]: I1206 09:04:45.988953 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:47 crc kubenswrapper[4954]: I1206 09:04:47.032683 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused" Dec 06 09:04:47 crc kubenswrapper[4954]: I1206 09:04:47.129999 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 06 09:04:48 crc kubenswrapper[4954]: I1206 09:04:48.004892 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 09:04:48 crc kubenswrapper[4954]: I1206 09:04:48.042730 4954 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 09:04:48 
Dec 06 09:04:48 crc kubenswrapper[4954]: I1206 09:04:48.048817 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 06 09:04:59 crc kubenswrapper[4954]: I1206 09:04:59.068006 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:04:59 crc kubenswrapper[4954]: I1206 09:04:59.118188 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b7c4f984d-gkz9f"
Dec 06 09:05:00 crc kubenswrapper[4954]: I1206 09:05:00.778927 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d65f959c6-krcmt"
Dec 06 09:05:00 crc kubenswrapper[4954]: I1206 09:05:00.902378 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"]
Dec 06 09:05:00 crc kubenswrapper[4954]: I1206 09:05:00.905882 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon-log" containerID="cri-o://ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff" gracePeriod=30
Dec 06 09:05:00 crc kubenswrapper[4954]: I1206 09:05:00.906114 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" containerID="cri-o://0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a" gracePeriod=30
Dec 06 09:05:00 crc kubenswrapper[4954]: I1206 09:05:00.946502 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": EOF"
Dec 06 09:05:01 crc kubenswrapper[4954]: I1206 09:05:01.635252 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.79:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 09:05:01 crc kubenswrapper[4954]: I1206 09:05:01.635347 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f5005a42-2172-4a4d-b063-a9c24649e747" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.79:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 09:05:04 crc kubenswrapper[4954]: I1206 09:05:04.046607 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0ef9-account-create-update-dxp8f"]
Dec 06 09:05:04 crc kubenswrapper[4954]: I1206 09:05:04.057475 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-twgpt"]
Dec 06 09:05:04 crc kubenswrapper[4954]: I1206 09:05:04.067627 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0ef9-account-create-update-dxp8f"]
Dec 06 09:05:04 crc kubenswrapper[4954]: I1206 09:05:04.075746 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-twgpt"]
Dec 06 09:05:04 crc kubenswrapper[4954]: I1206 09:05:04.307242 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40980->10.217.1.114:8443: read: connection reset by peer"
Dec 06 09:05:05 crc kubenswrapper[4954]: I1206 09:05:05.218058 4954 generic.go:334] "Generic (PLEG): container finished" podID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerID="0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a" exitCode=0
Dec 06 09:05:05 crc kubenswrapper[4954]: I1206 09:05:05.218155 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerDied","Data":"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a"}
Dec 06 09:05:05 crc kubenswrapper[4954]: I1206 09:05:05.476140 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d552c02-3df7-4f00-bd16-9c128a0fd274" path="/var/lib/kubelet/pods/3d552c02-3df7-4f00-bd16-9c128a0fd274/volumes"
Dec 06 09:05:05 crc kubenswrapper[4954]: I1206 09:05:05.477412 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9359d5c-ba3c-4a69-91ea-c13180163dc8" path="/var/lib/kubelet/pods/f9359d5c-ba3c-4a69-91ea-c13180163dc8/volumes"
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.237381 4954 generic.go:334] "Generic (PLEG): container finished" podID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerID="50410ebca16e2200383c361a425278479d427a64e32e8c2c02ea0931bc60aa62" exitCode=137
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.237841 4954 generic.go:334] "Generic (PLEG): container finished" podID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerID="6111d82da607d255c25b89312b5b098b5efc229dee3dcb54ddb20ff558778520" exitCode=137
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.237458 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerDied","Data":"50410ebca16e2200383c361a425278479d427a64e32e8c2c02ea0931bc60aa62"}
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.237947 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerDied","Data":"6111d82da607d255c25b89312b5b098b5efc229dee3dcb54ddb20ff558778520"}
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.237990 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65b5bbcb89-skdjp" event={"ID":"d89624ad-2d7f-4ab2-86f2-77936051c570","Type":"ContainerDied","Data":"2d774b4cec7a34218cc09493a4e8e544ef069cf7d25adedc9dec36fdad4cae4c"}
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.238020 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d774b4cec7a34218cc09493a4e8e544ef069cf7d25adedc9dec36fdad4cae4c"
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.240878 4954 generic.go:334] "Generic (PLEG): container finished" podID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerID="0edda40b0e2fbc331865e05d19e4e1f0fd7d5ab96e83325dd917ec0fb47121f6" exitCode=137
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.240892 4954 generic.go:334] "Generic (PLEG): container finished" podID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerID="71a0c6290e1fb04caecd32e1a8466f1476c5b203662fd2f8a3c7a4f0fd6ffccf" exitCode=137
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.240910 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerDied","Data":"0edda40b0e2fbc331865e05d19e4e1f0fd7d5ab96e83325dd917ec0fb47121f6"}
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.240969 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerDied","Data":"71a0c6290e1fb04caecd32e1a8466f1476c5b203662fd2f8a3c7a4f0fd6ffccf"}
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.330168 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65b5bbcb89-skdjp"
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.403373 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f76f758d9-hnvjl"
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410710 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key\") pod \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410815 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs\") pod \"d89624ad-2d7f-4ab2-86f2-77936051c570\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts\") pod \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410894 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs\") pod \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410922 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts\") pod \"d89624ad-2d7f-4ab2-86f2-77936051c570\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410947 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6w6d\" (UniqueName: \"kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d\") pod \"d89624ad-2d7f-4ab2-86f2-77936051c570\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410962 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key\") pod \"d89624ad-2d7f-4ab2-86f2-77936051c570\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.410978 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfckq\" (UniqueName: \"kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq\") pod \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.411005 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data\") pod \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\" (UID: \"ae2b9b5e-4640-4458-8335-e7afa9a587cc\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.411024 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data\") pod \"d89624ad-2d7f-4ab2-86f2-77936051c570\" (UID: \"d89624ad-2d7f-4ab2-86f2-77936051c570\") "
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.413708 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs" (OuterVolumeSpecName: "logs") pod "d89624ad-2d7f-4ab2-86f2-77936051c570" (UID: "d89624ad-2d7f-4ab2-86f2-77936051c570"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.414992 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs" (OuterVolumeSpecName: "logs") pod "ae2b9b5e-4640-4458-8335-e7afa9a587cc" (UID: "ae2b9b5e-4640-4458-8335-e7afa9a587cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.418056 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ae2b9b5e-4640-4458-8335-e7afa9a587cc" (UID: "ae2b9b5e-4640-4458-8335-e7afa9a587cc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.419307 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d" (OuterVolumeSpecName: "kube-api-access-x6w6d") pod "d89624ad-2d7f-4ab2-86f2-77936051c570" (UID: "d89624ad-2d7f-4ab2-86f2-77936051c570"). InnerVolumeSpecName "kube-api-access-x6w6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.425408 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d89624ad-2d7f-4ab2-86f2-77936051c570" (UID: "d89624ad-2d7f-4ab2-86f2-77936051c570"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.426829 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq" (OuterVolumeSpecName: "kube-api-access-lfckq") pod "ae2b9b5e-4640-4458-8335-e7afa9a587cc" (UID: "ae2b9b5e-4640-4458-8335-e7afa9a587cc"). InnerVolumeSpecName "kube-api-access-lfckq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.445942 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data" (OuterVolumeSpecName: "config-data") pod "d89624ad-2d7f-4ab2-86f2-77936051c570" (UID: "d89624ad-2d7f-4ab2-86f2-77936051c570"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.446405 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data" (OuterVolumeSpecName: "config-data") pod "ae2b9b5e-4640-4458-8335-e7afa9a587cc" (UID: "ae2b9b5e-4640-4458-8335-e7afa9a587cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.448931 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts" (OuterVolumeSpecName: "scripts") pod "d89624ad-2d7f-4ab2-86f2-77936051c570" (UID: "d89624ad-2d7f-4ab2-86f2-77936051c570"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.454491 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts" (OuterVolumeSpecName: "scripts") pod "ae2b9b5e-4640-4458-8335-e7afa9a587cc" (UID: "ae2b9b5e-4640-4458-8335-e7afa9a587cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535787 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ae2b9b5e-4640-4458-8335-e7afa9a587cc-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535837 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89624ad-2d7f-4ab2-86f2-77936051c570-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535851 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535867 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2b9b5e-4640-4458-8335-e7afa9a587cc-logs\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535884 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535898 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6w6d\" (UniqueName: \"kubernetes.io/projected/d89624ad-2d7f-4ab2-86f2-77936051c570-kube-api-access-x6w6d\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535914 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d89624ad-2d7f-4ab2-86f2-77936051c570-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535926 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfckq\" (UniqueName: \"kubernetes.io/projected/ae2b9b5e-4640-4458-8335-e7afa9a587cc-kube-api-access-lfckq\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535943 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae2b9b5e-4640-4458-8335-e7afa9a587cc-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:06 crc kubenswrapper[4954]: I1206 09:05:06.535954 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d89624ad-2d7f-4ab2-86f2-77936051c570-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.015342 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.254612 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65b5bbcb89-skdjp"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.255429 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f76f758d9-hnvjl"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.255850 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f76f758d9-hnvjl" event={"ID":"ae2b9b5e-4640-4458-8335-e7afa9a587cc","Type":"ContainerDied","Data":"dd1534e6438fb1754675093192b14b553dc13d9f303be6af65f630df4d760612"}
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.255915 4954 scope.go:117] "RemoveContainer" containerID="0edda40b0e2fbc331865e05d19e4e1f0fd7d5ab96e83325dd917ec0fb47121f6"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.302854 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"]
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.317368 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f76f758d9-hnvjl"]
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.326227 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"]
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.338737 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65b5bbcb89-skdjp"]
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.431611 4954 scope.go:117] "RemoveContainer" containerID="71a0c6290e1fb04caecd32e1a8466f1476c5b203662fd2f8a3c7a4f0fd6ffccf"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.471534 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" path="/var/lib/kubelet/pods/ae2b9b5e-4640-4458-8335-e7afa9a587cc/volumes"
Dec 06 09:05:07 crc kubenswrapper[4954]: I1206 09:05:07.472187 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" path="/var/lib/kubelet/pods/d89624ad-2d7f-4ab2-86f2-77936051c570/volumes"
Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.101648
4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.102030 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.102093 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.103347 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.103424 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" gracePeriod=600 Dec 06 09:05:10 crc kubenswrapper[4954]: E1206 09:05:10.225957 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.289064 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" exitCode=0 Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.289155 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78"} Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.289793 4954 scope.go:117] "RemoveContainer" containerID="9a5af8207c0d9a138aaaadc63eac45bf948ce076df26cf72d671f69df33df520" Dec 06 09:05:10 crc kubenswrapper[4954]: I1206 09:05:10.291054 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:05:10 crc kubenswrapper[4954]: E1206 09:05:10.292042 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:05:14 crc kubenswrapper[4954]: I1206 09:05:14.035384 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kstzz"] Dec 06 09:05:14 crc kubenswrapper[4954]: I1206 09:05:14.047780 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kstzz"] Dec 06 09:05:15 crc kubenswrapper[4954]: I1206 09:05:15.455002 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e30c14-e779-4b5d-ab68-479e2d136bdb" path="/var/lib/kubelet/pods/c6e30c14-e779-4b5d-ab68-479e2d136bdb/volumes" Dec 06 09:05:17 crc kubenswrapper[4954]: I1206 09:05:17.015818 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused" Dec 06 09:05:22 crc kubenswrapper[4954]: I1206 09:05:22.443689 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:05:22 crc kubenswrapper[4954]: E1206 09:05:22.444191 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.203130 4954 scope.go:117] "RemoveContainer" containerID="44b4bd98edb9bcd48ab2b4c3aec550a32514aa64e0069b68b3a9d2a665392849" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.232030 4954 scope.go:117] "RemoveContainer" containerID="ca8553feff5c2dc82ad34b81ceb5d46320297a21c5e130c084e1398088992129" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.277948 4954 scope.go:117] "RemoveContainer" containerID="46391dd4358449bffd5ce900ab1b3901cff56b899696122218a52ca8096dae06" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.320838 4954 scope.go:117] "RemoveContainer" containerID="821307b669af19f49db134ae5e408c5a20edf17d92a2eee3b7eeb3634ce89dd6" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.341419 4954 scope.go:117] "RemoveContainer" containerID="6352fa3d04d90cd44624806bb282f6f06ca4637543e09ca43ce788a1b30b167c" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.360818 4954 scope.go:117] "RemoveContainer" containerID="6b756174f73f0e8fe692af2bab18bcb0e3c4582b66190530c405f910ae983e36" Dec 06 09:05:24 crc kubenswrapper[4954]: I1206 09:05:24.419152 4954 scope.go:117] "RemoveContainer" containerID="9338ec5d4dd7ab5f6ec6a6b978da9252bde6610cc6e826ffc65e7cec279a1fea" Dec 06 09:05:27 crc kubenswrapper[4954]: I1206 09:05:27.016033 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b7c4f984d-gkz9f" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.437342 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b7c4f984d-gkz9f" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.552084 4954 generic.go:334] "Generic (PLEG): container finished" podID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerID="ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff" exitCode=137 Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.552153 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerDied","Data":"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff"} Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.552188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7c4f984d-gkz9f" event={"ID":"dd5a5191-b7b3-418b-8c8f-034808addca0","Type":"ContainerDied","Data":"1859a518ee07f51c401f88cfdc0e8c86a72527d45bbf4be7d53a10986f082e8a"} Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.552319 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7c4f984d-gkz9f" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.552633 4954 scope.go:117] "RemoveContainer" containerID="0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.618731 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.618783 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.618831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.618993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5lsf\" (UniqueName: \"kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.619048 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.619084 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.619152 4954 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs\") pod \"dd5a5191-b7b3-418b-8c8f-034808addca0\" (UID: \"dd5a5191-b7b3-418b-8c8f-034808addca0\") " Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.620218 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs" (OuterVolumeSpecName: "logs") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.624178 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.624470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf" (OuterVolumeSpecName: "kube-api-access-t5lsf") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "kube-api-access-t5lsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.647007 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data" (OuterVolumeSpecName: "config-data") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.652183 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.660235 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts" (OuterVolumeSpecName: "scripts") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.673711 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "dd5a5191-b7b3-418b-8c8f-034808addca0" (UID: "dd5a5191-b7b3-418b-8c8f-034808addca0"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721122 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721163 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721173 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5a5191-b7b3-418b-8c8f-034808addca0-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721182 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5191-b7b3-418b-8c8f-034808addca0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721192 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721201 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5a5191-b7b3-418b-8c8f-034808addca0-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.721210 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5lsf\" (UniqueName: \"kubernetes.io/projected/dd5a5191-b7b3-418b-8c8f-034808addca0-kube-api-access-t5lsf\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.723978 4954 scope.go:117] "RemoveContainer" containerID="ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.745122 4954 scope.go:117] "RemoveContainer" containerID="0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a" Dec 06 09:05:31 crc kubenswrapper[4954]: E1206 09:05:31.745643 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a\": container with ID starting with 0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a not found: ID does not exist" containerID="0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.745675 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a"} err="failed to get container status \"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a\": rpc error: code = NotFound desc = could not find container \"0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a\": container with ID starting with 0499fa6814d418760d7e8ca7096e3ef6a1cdab0903ac2fd783a78b1048b9a91a not found: ID does not exist" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.745696 4954 scope.go:117] "RemoveContainer" containerID="ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff" Dec 06 09:05:31 crc kubenswrapper[4954]: 
E1206 09:05:31.746199 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff\": container with ID starting with ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff not found: ID does not exist" containerID="ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.746222 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff"} err="failed to get container status \"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff\": rpc error: code = NotFound desc = could not find container \"ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff\": container with ID starting with ef30c1734aeaa1953add15280082498ef0de72516e2a7026c2d45ea12a9373ff not found: ID does not exist" Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.894424 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"] Dec 06 09:05:31 crc kubenswrapper[4954]: I1206 09:05:31.902877 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b7c4f984d-gkz9f"] Dec 06 09:05:33 crc kubenswrapper[4954]: I1206 09:05:33.443341 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:05:33 crc kubenswrapper[4954]: E1206 09:05:33.443896 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:05:33 crc kubenswrapper[4954]: I1206 09:05:33.453433 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" path="/var/lib/kubelet/pods/dd5a5191-b7b3-418b-8c8f-034808addca0/volumes" Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.048451 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-z8n9l"] Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.057873 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fa1f-account-create-update-2lftp"] Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.066352 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fa1f-account-create-update-2lftp"] Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.073883 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-z8n9l"] Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.455085 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667c7f97-0625-43e8-bd1d-b248f07f231f" path="/var/lib/kubelet/pods/667c7f97-0625-43e8-bd1d-b248f07f231f/volumes" Dec 06 09:05:41 crc kubenswrapper[4954]: I1206 09:05:41.455817 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2198ae3-47db-45ba-990d-2bf1c32800e3" path="/var/lib/kubelet/pods/c2198ae3-47db-45ba-990d-2bf1c32800e3/volumes" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.358769 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-7c75d6449b-lzlh9"] Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359496 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359513 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359530 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359537 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359555 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359583 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359622 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359632 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359644 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359651 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: E1206 09:05:42.359668 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359676 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359898 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359916 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359935 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359960 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2b9b5e-4640-4458-8335-e7afa9a587cc" containerName="horizon" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359976 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89624ad-2d7f-4ab2-86f2-77936051c570" containerName="horizon-log" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.359986 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5a5191-b7b3-418b-8c8f-034808addca0" containerName="horizon-log" Dec 06 09:05:42 
crc kubenswrapper[4954]: I1206 09:05:42.361272 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.384158 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c75d6449b-lzlh9"] Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.527389 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-combined-ca-bundle\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.527454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-config-data\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.527477 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-secret-key\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.527495 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42795beb-5796-4fb5-a767-6d241d559e75-logs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.528016 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-tls-certs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.528196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-scripts\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.528311 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l452t\" (UniqueName: \"kubernetes.io/projected/42795beb-5796-4fb5-a767-6d241d559e75-kube-api-access-l452t\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629624 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-scripts\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629681 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l452t\" (UniqueName: \"kubernetes.io/projected/42795beb-5796-4fb5-a767-6d241d559e75-kube-api-access-l452t\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629758 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-combined-ca-bundle\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629832 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-config-data\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629865 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-secret-key\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.629880 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42795beb-5796-4fb5-a767-6d241d559e75-logs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.630034 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-tls-certs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.632924 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42795beb-5796-4fb5-a767-6d241d559e75-logs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.632977 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-scripts\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.634148 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42795beb-5796-4fb5-a767-6d241d559e75-config-data\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.636844 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-secret-key\") 
pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.648400 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-combined-ca-bundle\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.649534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l452t\" (UniqueName: \"kubernetes.io/projected/42795beb-5796-4fb5-a767-6d241d559e75-kube-api-access-l452t\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.650214 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42795beb-5796-4fb5-a767-6d241d559e75-horizon-tls-certs\") pod \"horizon-7c75d6449b-lzlh9\" (UID: \"42795beb-5796-4fb5-a767-6d241d559e75\") " pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:42 crc kubenswrapper[4954]: I1206 09:05:42.687183 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.143299 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c75d6449b-lzlh9"] Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.676252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75d6449b-lzlh9" event={"ID":"42795beb-5796-4fb5-a767-6d241d559e75","Type":"ContainerStarted","Data":"041aa3b30353e425363b3ee0ea18ff49fbbb1e15b062a4af0149af933e8ab9d7"} Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.676616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75d6449b-lzlh9" event={"ID":"42795beb-5796-4fb5-a767-6d241d559e75","Type":"ContainerStarted","Data":"8e6833f12696ba6d4c42f7b4dbdc6461a15ee166eff55476b0432cb1892ae984"} Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.676629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75d6449b-lzlh9" event={"ID":"42795beb-5796-4fb5-a767-6d241d559e75","Type":"ContainerStarted","Data":"50b7512df8310987365b334c28a037b260feffd0b63ac043c6f6711a2f69e27f"} Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.697941 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-dlq25"] Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.699364 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.708524 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-dlq25"] Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.716370 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c75d6449b-lzlh9" podStartSLOduration=1.716337851 podStartE2EDuration="1.716337851s" podCreationTimestamp="2025-12-06 09:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:05:43.708725307 +0000 UTC m=+7718.522084696" watchObservedRunningTime="2025-12-06 09:05:43.716337851 +0000 UTC m=+7718.529697240" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.786933 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-ade3-account-create-update-s4f7m"] Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.788423 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.791110 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.799150 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ade3-account-create-update-s4f7m"] Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.856447 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.856547 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9l9\" (UniqueName: \"kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.958914 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.958970 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.959004 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkb5n\" (UniqueName: \"kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:43 crc 
kubenswrapper[4954]: I1206 09:05:43.959031 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9l9\" (UniqueName: \"kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.959733 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:43 crc kubenswrapper[4954]: I1206 09:05:43.975816 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9l9\" (UniqueName: \"kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9\") pod \"heat-db-create-dlq25\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " pod="openstack/heat-db-create-dlq25" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.041921 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-dlq25" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.061276 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.061613 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkb5n\" (UniqueName: \"kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.062823 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.080388 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkb5n\" (UniqueName: \"kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n\") pod \"heat-ade3-account-create-update-s4f7m\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.106008 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.553009 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-dlq25"] Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.623897 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ade3-account-create-update-s4f7m"] Dec 06 09:05:44 crc kubenswrapper[4954]: W1206 09:05:44.636845 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ad2ae4b_d1b0_4f50_9b00_215df9ed6069.slice/crio-674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909 WatchSource:0}: Error finding container 674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909: Status 404 returned error can't find the container with id 674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909 Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.688223 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ade3-account-create-update-s4f7m" event={"ID":"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069","Type":"ContainerStarted","Data":"674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909"} Dec 06 09:05:44 crc kubenswrapper[4954]: I1206 09:05:44.691980 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-dlq25" event={"ID":"d43ee893-bc98-4376-9581-e44fe5e18298","Type":"ContainerStarted","Data":"d30e759ce4873a30c57142bc3f8b0dca2a4739a7fc271a5f88809a9483ce9fdd"} Dec 06 09:05:45 crc kubenswrapper[4954]: I1206 09:05:45.709366 4954 generic.go:334] "Generic (PLEG): container finished" podID="d43ee893-bc98-4376-9581-e44fe5e18298" containerID="002c8462f127e7c759f1f27db9d32832385db0f3a9351ad070a87405de6a551b" exitCode=0 Dec 06 09:05:45 crc kubenswrapper[4954]: I1206 09:05:45.709504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-dlq25" event={"ID":"d43ee893-bc98-4376-9581-e44fe5e18298","Type":"ContainerDied","Data":"002c8462f127e7c759f1f27db9d32832385db0f3a9351ad070a87405de6a551b"} Dec 06 09:05:45 crc kubenswrapper[4954]: I1206 09:05:45.714416 4954 generic.go:334] "Generic (PLEG): container finished" podID="1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" containerID="c5269119b39b0da6fa77b6a1eeddbe5f9db8eb058c73553be6e40ed493806f4e" exitCode=0 Dec 06 09:05:45 crc kubenswrapper[4954]: I1206 09:05:45.714480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ade3-account-create-update-s4f7m" event={"ID":"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069","Type":"ContainerDied","Data":"c5269119b39b0da6fa77b6a1eeddbe5f9db8eb058c73553be6e40ed493806f4e"} Dec 06 09:05:46 crc kubenswrapper[4954]: I1206 09:05:46.443321 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:05:46 crc kubenswrapper[4954]: E1206 09:05:46.443643 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.149373 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-dlq25" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.158140 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269150 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9l9\" (UniqueName: \"kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9\") pod \"d43ee893-bc98-4376-9581-e44fe5e18298\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269210 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts\") pod \"d43ee893-bc98-4376-9581-e44fe5e18298\" (UID: \"d43ee893-bc98-4376-9581-e44fe5e18298\") " Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269313 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts\") pod \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269410 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkb5n\" (UniqueName: \"kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n\") pod \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\" (UID: \"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069\") " Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269788 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" (UID: "1ad2ae4b-d1b0-4f50-9b00-215df9ed6069"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269843 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d43ee893-bc98-4376-9581-e44fe5e18298" (UID: "d43ee893-bc98-4376-9581-e44fe5e18298"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269950 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d43ee893-bc98-4376-9581-e44fe5e18298-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.269972 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.274849 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9" (OuterVolumeSpecName: "kube-api-access-hh9l9") pod "d43ee893-bc98-4376-9581-e44fe5e18298" (UID: "d43ee893-bc98-4376-9581-e44fe5e18298"). 
InnerVolumeSpecName "kube-api-access-hh9l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.275126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n" (OuterVolumeSpecName: "kube-api-access-tkb5n") pod "1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" (UID: "1ad2ae4b-d1b0-4f50-9b00-215df9ed6069"). InnerVolumeSpecName "kube-api-access-tkb5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.371737 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkb5n\" (UniqueName: \"kubernetes.io/projected/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069-kube-api-access-tkb5n\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.371773 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9l9\" (UniqueName: \"kubernetes.io/projected/d43ee893-bc98-4376-9581-e44fe5e18298-kube-api-access-hh9l9\") on node \"crc\" DevicePath \"\"" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.740419 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ade3-account-create-update-s4f7m" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.740449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ade3-account-create-update-s4f7m" event={"ID":"1ad2ae4b-d1b0-4f50-9b00-215df9ed6069","Type":"ContainerDied","Data":"674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909"} Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.740636 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674e32b5e1d4917b253ea43cca0f248c2b334543b2dae0343458c420a3792909" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.741990 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-dlq25" event={"ID":"d43ee893-bc98-4376-9581-e44fe5e18298","Type":"ContainerDied","Data":"d30e759ce4873a30c57142bc3f8b0dca2a4739a7fc271a5f88809a9483ce9fdd"} Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.742338 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d30e759ce4873a30c57142bc3f8b0dca2a4739a7fc271a5f88809a9483ce9fdd" Dec 06 09:05:47 crc kubenswrapper[4954]: I1206 09:05:47.742034 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-dlq25" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.913695 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-w7d87"] Dec 06 09:05:48 crc kubenswrapper[4954]: E1206 09:05:48.914217 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43ee893-bc98-4376-9581-e44fe5e18298" containerName="mariadb-database-create" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.914233 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43ee893-bc98-4376-9581-e44fe5e18298" containerName="mariadb-database-create" Dec 06 09:05:48 crc kubenswrapper[4954]: E1206 09:05:48.914252 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" containerName="mariadb-account-create-update" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.914261 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" containerName="mariadb-account-create-update" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.914498 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43ee893-bc98-4376-9581-e44fe5e18298" containerName="mariadb-database-create" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.914539 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" containerName="mariadb-account-create-update" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.915416 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.921789 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-rvftz" Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.936472 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w7d87"] Dec 06 09:05:48 crc kubenswrapper[4954]: I1206 09:05:48.942408 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.107391 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.107452 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq8pc\" (UniqueName: \"kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.107491 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.209776 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.210107 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq8pc\" (UniqueName: \"kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.210143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.214982 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.216531 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.227217 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq8pc\" (UniqueName: \"kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc\") pod \"heat-db-sync-w7d87\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.253944 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-w7d87" Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.522181 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w7d87"] Dec 06 09:05:49 crc kubenswrapper[4954]: W1206 09:05:49.530825 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91de0dbc_f8be_4cf7_89fa_6cc87a075b0e.slice/crio-2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4 WatchSource:0}: Error finding container 2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4: Status 404 returned error can't find the container with id 2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4 Dec 06 09:05:49 crc kubenswrapper[4954]: I1206 09:05:49.764400 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w7d87" event={"ID":"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e","Type":"ContainerStarted","Data":"2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4"} Dec 06 09:05:51 crc kubenswrapper[4954]: I1206 09:05:51.055125 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nntvl"] Dec 06 09:05:51 crc kubenswrapper[4954]: I1206 09:05:51.065222 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nntvl"] Dec 06 09:05:51 crc kubenswrapper[4954]: I1206 09:05:51.453004 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcff096-e20d-4ca1-9211-d48324b0fb6a" path="/var/lib/kubelet/pods/5bcff096-e20d-4ca1-9211-d48324b0fb6a/volumes" Dec 06 09:05:52 crc kubenswrapper[4954]: I1206 09:05:52.688201 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:52 crc kubenswrapper[4954]: I1206 09:05:52.688826 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:05:58 crc kubenswrapper[4954]: I1206 09:05:58.863265 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w7d87" event={"ID":"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e","Type":"ContainerStarted","Data":"266c92b1584be38d42e3adeb64aa03413b15bf3ad068a7df81ab5d73661b99c0"} Dec 06 09:05:58 crc kubenswrapper[4954]: I1206 09:05:58.886516 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-w7d87" podStartSLOduration=2.489573661 podStartE2EDuration="10.886495106s" podCreationTimestamp="2025-12-06 09:05:48 +0000 UTC" firstStartedPulling="2025-12-06 09:05:49.535198804 +0000 UTC m=+7724.348558193" lastFinishedPulling="2025-12-06 09:05:57.932120239 +0000 UTC m=+7732.745479638" observedRunningTime="2025-12-06 09:05:58.880935797 +0000 UTC m=+7733.694295196" watchObservedRunningTime="2025-12-06 09:05:58.886495106 +0000 UTC m=+7733.699854495" Dec 06 09:06:00 crc kubenswrapper[4954]: I1206 09:06:00.443743 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:06:00 crc kubenswrapper[4954]: E1206 09:06:00.445328 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:06:00 crc kubenswrapper[4954]: I1206 09:06:00.883002 4954 generic.go:334] "Generic (PLEG): container finished" podID="91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" containerID="266c92b1584be38d42e3adeb64aa03413b15bf3ad068a7df81ab5d73661b99c0" exitCode=0 Dec 06 09:06:00 crc kubenswrapper[4954]: I1206 09:06:00.883449 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w7d87" event={"ID":"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e","Type":"ContainerDied","Data":"266c92b1584be38d42e3adeb64aa03413b15bf3ad068a7df81ab5d73661b99c0"} Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.264059 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w7d87" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.358341 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data\") pod \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.358807 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq8pc\" (UniqueName: \"kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc\") pod \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.358876 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle\") pod \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\" (UID: \"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e\") " Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.365042 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc" (OuterVolumeSpecName: "kube-api-access-kq8pc") pod "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" (UID: "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e"). InnerVolumeSpecName "kube-api-access-kq8pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.387039 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" (UID: "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.431879 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data" (OuterVolumeSpecName: "config-data") pod "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" (UID: "91de0dbc-f8be-4cf7-89fa-6cc87a075b0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.460920 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.460964 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.461006 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq8pc\" (UniqueName: \"kubernetes.io/projected/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e-kube-api-access-kq8pc\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.689338 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c75d6449b-lzlh9" podUID="42795beb-5796-4fb5-a767-6d241d559e75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8443: connect: connection refused" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.900027 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w7d87" event={"ID":"91de0dbc-f8be-4cf7-89fa-6cc87a075b0e","Type":"ContainerDied","Data":"2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4"} Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.900061 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w7d87" Dec 06 09:06:02 crc kubenswrapper[4954]: I1206 09:06:02.900065 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2acc279d6ce895470d7fadf13b42f3212a684680ae82f103551c14ad8c5683f4" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.887939 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:03 crc kubenswrapper[4954]: E1206 09:06:03.889206 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" containerName="heat-db-sync" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.889343 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" containerName="heat-db-sync" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.889721 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" containerName="heat-db-sync" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.890676 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.892890 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-rvftz" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.893267 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.893532 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.898952 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.989167 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.989263 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwt2\" (UniqueName: \"kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.989517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:03 crc kubenswrapper[4954]: I1206 09:06:03.989642 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.051357 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.052974 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.065653 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.073353 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.074968 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.078014 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.082950 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.092406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.092685 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.092846 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.092982 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwt2\" (UniqueName: \"kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.101385 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.102640 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.110845 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.120931 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle\") pod \"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.127752 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwt2\" (UniqueName: \"kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2\") pod 
\"heat-engine-6cbfdb884b-xsxh7\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.194700 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.194761 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgm25\" (UniqueName: \"kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.194792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4zrv\" (UniqueName: \"kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.194814 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.195337 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.195410 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.195643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.195689 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.208217 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297512 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgm25\" (UniqueName: \"kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4zrv\" (UniqueName: \"kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297654 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297710 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.297799 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.303658 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " 
pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.304220 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.305790 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.309116 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.311707 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.320410 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.320988 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgm25\" (UniqueName: \"kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25\") pod \"heat-api-bc845c464-m8g4r\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.321363 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4zrv\" (UniqueName: \"kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv\") pod \"heat-cfnapi-7499b56df5-ql897\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.400310 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:04 crc kubenswrapper[4954]: I1206 09:06:04.530686 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:04.752776 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:04.946303 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cbfdb884b-xsxh7" event={"ID":"c709125b-f0e4-46a0-94f1-e13feafc362e","Type":"ContainerStarted","Data":"9f4759073045d996020dc3da59ee48d0e339fda8617388afe904dac0f73a557e"} Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.579523 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.635146 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:05 crc kubenswrapper[4954]: W1206 09:06:05.639773 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af86654_409d_4f8b_909c_dfe1930993e4.slice/crio-e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d WatchSource:0}: Error finding container e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d: Status 404 returned error can't find the container with id e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.956978 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cbfdb884b-xsxh7" event={"ID":"c709125b-f0e4-46a0-94f1-e13feafc362e","Type":"ContainerStarted","Data":"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d"} Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.957511 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.960412 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-bc845c464-m8g4r" event={"ID":"8fbd10d0-15f8-44a6-997b-974371621108","Type":"ContainerStarted","Data":"3837fa902ae7066b3c944e6c3bf4ab669be521d6475e39456400465f62e650fd"} Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.962488 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7499b56df5-ql897" event={"ID":"2af86654-409d-4f8b-909c-dfe1930993e4","Type":"ContainerStarted","Data":"e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d"} Dec 06 09:06:05 crc kubenswrapper[4954]: I1206 09:06:05.984816 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6cbfdb884b-xsxh7" podStartSLOduration=2.984791172 podStartE2EDuration="2.984791172s" podCreationTimestamp="2025-12-06 09:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:06:05.975511574 +0000 UTC m=+7740.788870963" watchObservedRunningTime="2025-12-06 09:06:05.984791172 +0000 UTC m=+7740.798150561" Dec 06 09:06:07 crc kubenswrapper[4954]: I1206 09:06:07.984834 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7499b56df5-ql897" event={"ID":"2af86654-409d-4f8b-909c-dfe1930993e4","Type":"ContainerStarted","Data":"93cccc4cd7a08fd67bebb0ab739f3db92d3d4902537f1b9b2e1d69e710f0778e"} Dec 06 09:06:07 crc kubenswrapper[4954]: I1206 09:06:07.985371 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:07 crc kubenswrapper[4954]: I1206 09:06:07.988166 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-bc845c464-m8g4r" event={"ID":"8fbd10d0-15f8-44a6-997b-974371621108","Type":"ContainerStarted","Data":"45bd7f9435726a94de00386b4c7d07760546b063634963611255646eb6ab28ca"} Dec 06 09:06:07 crc kubenswrapper[4954]: I1206 09:06:07.988416 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:08 crc kubenswrapper[4954]: I1206 09:06:08.000298 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7499b56df5-ql897" podStartSLOduration=2.512008035 podStartE2EDuration="4.000276025s" podCreationTimestamp="2025-12-06 09:06:04 +0000 UTC" firstStartedPulling="2025-12-06 09:06:05.642936284 +0000 UTC m=+7740.456295663" lastFinishedPulling="2025-12-06 09:06:07.131204264 +0000 UTC m=+7741.944563653" observedRunningTime="2025-12-06 09:06:07.997608993 +0000 UTC m=+7742.810968382" watchObservedRunningTime="2025-12-06 09:06:08.000276025 +0000 UTC m=+7742.813635414" Dec 06 09:06:08 crc kubenswrapper[4954]: I1206 09:06:08.020298 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-bc845c464-m8g4r" podStartSLOduration=2.486751729 podStartE2EDuration="4.020279811s" podCreationTimestamp="2025-12-06 09:06:04 +0000 UTC" firstStartedPulling="2025-12-06 09:06:05.594582139 +0000 UTC m=+7740.407941528" lastFinishedPulling="2025-12-06 09:06:07.128110211 +0000 UTC m=+7741.941469610" observedRunningTime="2025-12-06 09:06:08.014437294 +0000 UTC m=+7742.827796683" watchObservedRunningTime="2025-12-06 09:06:08.020279811 +0000 UTC m=+7742.833639200" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.060571 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-944444f77-s592d"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.063229 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.081080 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-944444f77-s592d"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.093533 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.095484 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.108997 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.110436 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.121498 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.152108 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.152883 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6mgl\" (UniqueName: \"kubernetes.io/projected/2027c398-a02d-4d9b-a2b7-5ff833e850c3-kube-api-access-q6mgl\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.152974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-combined-ca-bundle\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.153223 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.153274 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data-custom\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255093 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255165 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255225 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255256 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data\") pod 
\"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255432 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255464 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlq4d\" (UniqueName: \"kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6mgl\" (UniqueName: \"kubernetes.io/projected/2027c398-a02d-4d9b-a2b7-5ff833e850c3-kube-api-access-q6mgl\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255696 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb52k\" (UniqueName: \"kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-combined-ca-bundle\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255968 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.255998 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data-custom\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.264120 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data-custom\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.266126 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-combined-ca-bundle\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.274820 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6mgl\" (UniqueName: \"kubernetes.io/projected/2027c398-a02d-4d9b-a2b7-5ff833e850c3-kube-api-access-q6mgl\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.275039 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2027c398-a02d-4d9b-a2b7-5ff833e850c3-config-data\") pod \"heat-engine-944444f77-s592d\" (UID: \"2027c398-a02d-4d9b-a2b7-5ff833e850c3\") " pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359142 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb52k\" (UniqueName: \"kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359305 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359375 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359483 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.359515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlq4d\" (UniqueName: \"kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.364808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.364976 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.366385 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.366725 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.366778 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.373985 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.382453 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb52k\" (UniqueName: \"kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k\") pod \"heat-cfnapi-6877547978-9nvt2\" (UID: 
\"b0137440-41c2-46ae-864b-c649e9331ed4\") " pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.385662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlq4d\" (UniqueName: \"kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d\") pod \"heat-api-7f556dfdb9-m6w2r\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.388931 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.431273 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:11 crc kubenswrapper[4954]: I1206 09:06:11.441457 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.013313 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.022235 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f556dfdb9-m6w2r" event={"ID":"be6e0c18-9389-4828-8654-3de4e56894c6","Type":"ContainerStarted","Data":"244dfa2aba4e500aa50dc9780e62a89f92d5ca7e6367225af102cbc53d179533"} Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.127747 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.165255 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-944444f77-s592d"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.223380 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.223874 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-bc845c464-m8g4r" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" containerID="cri-o://45bd7f9435726a94de00386b4c7d07760546b063634963611255646eb6ab28ca" gracePeriod=60 Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.256011 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.256342 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7499b56df5-ql897" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" containerName="heat-cfnapi" containerID="cri-o://93cccc4cd7a08fd67bebb0ab739f3db92d3d4902537f1b9b2e1d69e710f0778e" gracePeriod=60 Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.260747 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-bc845c464-m8g4r" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.123:8004/healthcheck\": EOF" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.298761 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-79d7bb7d88-c64lf"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.308217 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.312199 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.312806 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.319312 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79d7bb7d88-c64lf"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.355838 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-75b9554dd5-rd52z"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.357030 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.358679 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.359143 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.381609 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75b9554dd5-rd52z"] Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h96t\" (UniqueName: \"kubernetes.io/projected/4827e274-0c02-46b7-a637-412a532734d8-kube-api-access-8h96t\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387741 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-combined-ca-bundle\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387793 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-internal-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387814 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-public-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387852 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data-custom\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.387872 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489334 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489425 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data-custom\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489490 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h96t\" (UniqueName: \"kubernetes.io/projected/4827e274-0c02-46b7-a637-412a532734d8-kube-api-access-8h96t\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489579 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-combined-ca-bundle\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489639 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-internal-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-combined-ca-bundle\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489791 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-internal-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489851 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-public-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc 
kubenswrapper[4954]: I1206 09:06:12.489904 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data-custom\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489935 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-public-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.489966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.490000 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tpsr\" (UniqueName: \"kubernetes.io/projected/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-kube-api-access-2tpsr\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.495991 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data-custom\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.496898 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-public-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.497685 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-config-data\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.508602 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-combined-ca-bundle\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.515075 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h96t\" (UniqueName: \"kubernetes.io/projected/4827e274-0c02-46b7-a637-412a532734d8-kube-api-access-8h96t\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.515267 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4827e274-0c02-46b7-a637-412a532734d8-internal-tls-certs\") pod \"heat-api-79d7bb7d88-c64lf\" (UID: \"4827e274-0c02-46b7-a637-412a532734d8\") " pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.591799 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.591848 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data-custom\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.591921 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-internal-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.591954 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-combined-ca-bundle\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.592020 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-public-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.592049 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tpsr\" (UniqueName: \"kubernetes.io/projected/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-kube-api-access-2tpsr\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.600200 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-combined-ca-bundle\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.600200 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-public-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.600754 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data-custom\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.603664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-config-data\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.603756 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-internal-tls-certs\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.610809 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tpsr\" (UniqueName: \"kubernetes.io/projected/7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66-kube-api-access-2tpsr\") pod \"heat-cfnapi-75b9554dd5-rd52z\" (UID: \"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66\") " pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.792522 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:12 crc kubenswrapper[4954]: I1206 09:06:12.802555 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.042751 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877547978-9nvt2" event={"ID":"b0137440-41c2-46ae-864b-c649e9331ed4","Type":"ContainerStarted","Data":"7288a750c0cdce22db88450e1fa38e01899b154d44d71051d90426d7d347ab72"} Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.042994 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877547978-9nvt2" event={"ID":"b0137440-41c2-46ae-864b-c649e9331ed4","Type":"ContainerStarted","Data":"cb44965cfc9a83a14fd97807ca97ef06f772c4d492900b8c022bc50474df5347"} Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.044200 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.062523 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-944444f77-s592d" event={"ID":"2027c398-a02d-4d9b-a2b7-5ff833e850c3","Type":"ContainerStarted","Data":"7ba004b362394c43901bee88a526766cf0730c10e3f41959fe488f50a1ca97e6"} Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.062593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-944444f77-s592d" event={"ID":"2027c398-a02d-4d9b-a2b7-5ff833e850c3","Type":"ContainerStarted","Data":"0eadcfe93f3e643830c017cfd79c1b59e6d2fb2128901ea31d0f8c9af6b66aa9"} Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.063198 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.089922 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-cfnapi-6877547978-9nvt2" podStartSLOduration=2.089879791 podStartE2EDuration="2.089879791s" podCreationTimestamp="2025-12-06 09:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:06:13.065033126 +0000 UTC m=+7747.878392525" watchObservedRunningTime="2025-12-06 09:06:13.089879791 +0000 UTC m=+7747.903239180" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.113822 4954 generic.go:334] "Generic (PLEG): container finished" podID="be6e0c18-9389-4828-8654-3de4e56894c6" containerID="2371f8fa0efc965ea61d89a67197c7163b41ac0ff1156588b61dd2608935b1a6" exitCode=1 Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.113870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f556dfdb9-m6w2r" event={"ID":"be6e0c18-9389-4828-8654-3de4e56894c6","Type":"ContainerDied","Data":"2371f8fa0efc965ea61d89a67197c7163b41ac0ff1156588b61dd2608935b1a6"} Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.117958 4954 scope.go:117] "RemoveContainer" containerID="2371f8fa0efc965ea61d89a67197c7163b41ac0ff1156588b61dd2608935b1a6" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.123013 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-944444f77-s592d" podStartSLOduration=2.122993048 podStartE2EDuration="2.122993048s" podCreationTimestamp="2025-12-06 09:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:06:13.107086772 +0000 UTC m=+7747.920446161" watchObservedRunningTime="2025-12-06 09:06:13.122993048 +0000 UTC m=+7747.936352437" Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.414432 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75b9554dd5-rd52z"] Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.560998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79d7bb7d88-c64lf"] Dec 06 09:06:13 crc kubenswrapper[4954]: I1206 09:06:13.920120 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.124232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" event={"ID":"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66","Type":"ContainerStarted","Data":"08f31f785a4f347d6f0e72a95267bbeda4969bacb00bc7e71b10e1f1355f76ae"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.124288 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" event={"ID":"7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66","Type":"ContainerStarted","Data":"abf60111ebade978536e17df35bd8c875c309fdb7d67d1b441f875d7c704db91"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.124378 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.127171 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79d7bb7d88-c64lf" event={"ID":"4827e274-0c02-46b7-a637-412a532734d8","Type":"ContainerStarted","Data":"858acaac2421a9eeb8d4004681314225451c69cc8e824a11c30380af849a71f4"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.127381 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79d7bb7d88-c64lf" 
event={"ID":"4827e274-0c02-46b7-a637-412a532734d8","Type":"ContainerStarted","Data":"e8c70e831e351d87050c704213f5c682bd9c6f733e9f26669e91abbc58ea632e"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.127448 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.129759 4954 generic.go:334] "Generic (PLEG): container finished" podID="be6e0c18-9389-4828-8654-3de4e56894c6" containerID="95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd" exitCode=1 Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.129794 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f556dfdb9-m6w2r" event={"ID":"be6e0c18-9389-4828-8654-3de4e56894c6","Type":"ContainerDied","Data":"95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.129832 4954 scope.go:117] "RemoveContainer" containerID="2371f8fa0efc965ea61d89a67197c7163b41ac0ff1156588b61dd2608935b1a6" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.130232 4954 scope.go:117] "RemoveContainer" containerID="95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd" Dec 06 09:06:14 crc kubenswrapper[4954]: E1206 09:06:14.130519 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f556dfdb9-m6w2r_openstack(be6e0c18-9389-4828-8654-3de4e56894c6)\"" pod="openstack/heat-api-7f556dfdb9-m6w2r" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.131790 4954 generic.go:334] "Generic (PLEG): container finished" podID="b0137440-41c2-46ae-864b-c649e9331ed4" containerID="7288a750c0cdce22db88450e1fa38e01899b154d44d71051d90426d7d347ab72" exitCode=1 Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.132676 4954 scope.go:117] "RemoveContainer" containerID="7288a750c0cdce22db88450e1fa38e01899b154d44d71051d90426d7d347ab72" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.132921 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877547978-9nvt2" event={"ID":"b0137440-41c2-46ae-864b-c649e9331ed4","Type":"ContainerDied","Data":"7288a750c0cdce22db88450e1fa38e01899b154d44d71051d90426d7d347ab72"} Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.173858 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" podStartSLOduration=2.17383244 podStartE2EDuration="2.17383244s" podCreationTimestamp="2025-12-06 09:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:06:14.155340754 +0000 UTC m=+7748.968700163" watchObservedRunningTime="2025-12-06 09:06:14.17383244 +0000 UTC m=+7748.987191829" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.240797 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-79d7bb7d88-c64lf" podStartSLOduration=2.240770983 podStartE2EDuration="2.240770983s" podCreationTimestamp="2025-12-06 09:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:06:14.190227719 +0000 UTC m=+7749.003587128" watchObservedRunningTime="2025-12-06 09:06:14.240770983 +0000 UTC m=+7749.054130402" Dec 06 
09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.402153 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:14 crc kubenswrapper[4954]: I1206 09:06:14.443117 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:06:14 crc kubenswrapper[4954]: E1206 09:06:14.443359 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.142651 4954 generic.go:334] "Generic (PLEG): container finished" podID="b0137440-41c2-46ae-864b-c649e9331ed4" containerID="43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda" exitCode=1 Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.142732 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877547978-9nvt2" event={"ID":"b0137440-41c2-46ae-864b-c649e9331ed4","Type":"ContainerDied","Data":"43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda"} Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.142772 4954 scope.go:117] "RemoveContainer" containerID="7288a750c0cdce22db88450e1fa38e01899b154d44d71051d90426d7d347ab72" Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.143322 4954 scope.go:117] "RemoveContainer" containerID="43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda" Dec 06 09:06:15 crc kubenswrapper[4954]: E1206 09:06:15.143991 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6877547978-9nvt2_openstack(b0137440-41c2-46ae-864b-c649e9331ed4)\"" pod="openstack/heat-cfnapi-6877547978-9nvt2" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.146223 4954 scope.go:117] "RemoveContainer" containerID="95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd" Dec 06 09:06:15 crc kubenswrapper[4954]: E1206 09:06:15.146442 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f556dfdb9-m6w2r_openstack(be6e0c18-9389-4828-8654-3de4e56894c6)\"" pod="openstack/heat-api-7f556dfdb9-m6w2r" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" Dec 06 09:06:15 crc kubenswrapper[4954]: I1206 09:06:15.471653 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.159226 4954 scope.go:117] "RemoveContainer" containerID="43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda" Dec 06 09:06:16 crc kubenswrapper[4954]: E1206 09:06:16.159493 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6877547978-9nvt2_openstack(b0137440-41c2-46ae-864b-c649e9331ed4)\"" pod="openstack/heat-cfnapi-6877547978-9nvt2" 
podUID="b0137440-41c2-46ae-864b-c649e9331ed4" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.432116 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.432180 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.433031 4954 scope.go:117] "RemoveContainer" containerID="95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd" Dec 06 09:06:16 crc kubenswrapper[4954]: E1206 09:06:16.433447 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f556dfdb9-m6w2r_openstack(be6e0c18-9389-4828-8654-3de4e56894c6)\"" pod="openstack/heat-api-7f556dfdb9-m6w2r" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.442463 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:16 crc kubenswrapper[4954]: I1206 09:06:16.442506 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.166185 4954 scope.go:117] "RemoveContainer" containerID="43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda" Dec 06 09:06:17 crc kubenswrapper[4954]: E1206 09:06:17.166712 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6877547978-9nvt2_openstack(b0137440-41c2-46ae-864b-c649e9331ed4)\"" pod="openstack/heat-cfnapi-6877547978-9nvt2" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.575412 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c75d6449b-lzlh9" Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.650283 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"] Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.650523 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon-log" containerID="cri-o://a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca" gracePeriod=30 Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.650609 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" containerID="cri-o://74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2" gracePeriod=30 Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.686366 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7499b56df5-ql897" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.124:8000/healthcheck\": read tcp 10.217.0.2:58832->10.217.1.124:8000: read: connection reset by peer" Dec 06 09:06:17 crc kubenswrapper[4954]: I1206 09:06:17.714154 4954 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/heat-api-bc845c464-m8g4r" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.123:8004/healthcheck\": read tcp 10.217.0.2:33948->10.217.1.123:8004: read: connection reset by peer" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.175608 4954 generic.go:334] "Generic (PLEG): container finished" podID="8fbd10d0-15f8-44a6-997b-974371621108" containerID="45bd7f9435726a94de00386b4c7d07760546b063634963611255646eb6ab28ca" exitCode=0 Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.175702 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-bc845c464-m8g4r" event={"ID":"8fbd10d0-15f8-44a6-997b-974371621108","Type":"ContainerDied","Data":"45bd7f9435726a94de00386b4c7d07760546b063634963611255646eb6ab28ca"} Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.175946 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-bc845c464-m8g4r" event={"ID":"8fbd10d0-15f8-44a6-997b-974371621108","Type":"ContainerDied","Data":"3837fa902ae7066b3c944e6c3bf4ab669be521d6475e39456400465f62e650fd"} Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.175962 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3837fa902ae7066b3c944e6c3bf4ab669be521d6475e39456400465f62e650fd" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.177664 4954 generic.go:334] "Generic (PLEG): container finished" podID="2af86654-409d-4f8b-909c-dfe1930993e4" containerID="93cccc4cd7a08fd67bebb0ab739f3db92d3d4902537f1b9b2e1d69e710f0778e" exitCode=0 Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.177712 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7499b56df5-ql897" event={"ID":"2af86654-409d-4f8b-909c-dfe1930993e4","Type":"ContainerDied","Data":"93cccc4cd7a08fd67bebb0ab739f3db92d3d4902537f1b9b2e1d69e710f0778e"} Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.177739 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7499b56df5-ql897" event={"ID":"2af86654-409d-4f8b-909c-dfe1930993e4","Type":"ContainerDied","Data":"e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d"} Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.177756 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d45112366e4b1bdaf2be083d7f0cc2dbb419566c887fc110bf5d613c0e4f0d" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.194556 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.212958 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366312 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle\") pod \"2af86654-409d-4f8b-909c-dfe1930993e4\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366425 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data\") pod \"8fbd10d0-15f8-44a6-997b-974371621108\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle\") pod \"8fbd10d0-15f8-44a6-997b-974371621108\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366534 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgm25\" (UniqueName: \"kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25\") pod \"8fbd10d0-15f8-44a6-997b-974371621108\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366587 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4zrv\" (UniqueName: \"kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv\") pod \"2af86654-409d-4f8b-909c-dfe1930993e4\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366633 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom\") pod \"2af86654-409d-4f8b-909c-dfe1930993e4\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366655 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data\") pod \"2af86654-409d-4f8b-909c-dfe1930993e4\" (UID: \"2af86654-409d-4f8b-909c-dfe1930993e4\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.366687 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom\") pod \"8fbd10d0-15f8-44a6-997b-974371621108\" (UID: \"8fbd10d0-15f8-44a6-997b-974371621108\") " Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.371745 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8fbd10d0-15f8-44a6-997b-974371621108" (UID: "8fbd10d0-15f8-44a6-997b-974371621108"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.372935 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv" (OuterVolumeSpecName: "kube-api-access-v4zrv") pod "2af86654-409d-4f8b-909c-dfe1930993e4" (UID: "2af86654-409d-4f8b-909c-dfe1930993e4"). InnerVolumeSpecName "kube-api-access-v4zrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.374350 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2af86654-409d-4f8b-909c-dfe1930993e4" (UID: "2af86654-409d-4f8b-909c-dfe1930993e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.382715 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25" (OuterVolumeSpecName: "kube-api-access-xgm25") pod "8fbd10d0-15f8-44a6-997b-974371621108" (UID: "8fbd10d0-15f8-44a6-997b-974371621108"). InnerVolumeSpecName "kube-api-access-xgm25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.406443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fbd10d0-15f8-44a6-997b-974371621108" (UID: "8fbd10d0-15f8-44a6-997b-974371621108"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.414889 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2af86654-409d-4f8b-909c-dfe1930993e4" (UID: "2af86654-409d-4f8b-909c-dfe1930993e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.427128 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data" (OuterVolumeSpecName: "config-data") pod "2af86654-409d-4f8b-909c-dfe1930993e4" (UID: "2af86654-409d-4f8b-909c-dfe1930993e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.434896 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data" (OuterVolumeSpecName: "config-data") pod "8fbd10d0-15f8-44a6-997b-974371621108" (UID: "8fbd10d0-15f8-44a6-997b-974371621108"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470637 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470679 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470688 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470698 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgm25\" (UniqueName: \"kubernetes.io/projected/8fbd10d0-15f8-44a6-997b-974371621108-kube-api-access-xgm25\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470707 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4zrv\" (UniqueName: \"kubernetes.io/projected/2af86654-409d-4f8b-909c-dfe1930993e4-kube-api-access-v4zrv\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470714 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470723 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af86654-409d-4f8b-909c-dfe1930993e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:18 crc kubenswrapper[4954]: I1206 09:06:18.470735 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fbd10d0-15f8-44a6-997b-974371621108-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.188216 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7499b56df5-ql897" Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.188237 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-bc845c464-m8g4r" Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.234751 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.243783 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-bc845c464-m8g4r"] Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.251600 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.259133 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7499b56df5-ql897"] Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.453647 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" path="/var/lib/kubelet/pods/2af86654-409d-4f8b-909c-dfe1930993e4/volumes" Dec 06 09:06:19 crc kubenswrapper[4954]: I1206 09:06:19.454501 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbd10d0-15f8-44a6-997b-974371621108" path="/var/lib/kubelet/pods/8fbd10d0-15f8-44a6-997b-974371621108/volumes" Dec 06 09:06:21 crc kubenswrapper[4954]: I1206 09:06:21.209785 4954 generic.go:334] "Generic (PLEG): container finished" podID="1709a14f-a574-4485-8993-2c5991a6ca80" containerID="74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2" exitCode=0 Dec 06 09:06:21 crc kubenswrapper[4954]: I1206 09:06:21.209992 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerDied","Data":"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2"} Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.133804 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-75b9554dd5-rd52z" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.220155 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.248281 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-79d7bb7d88-c64lf" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.329013 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.653789 4954 scope.go:117] "RemoveContainer" containerID="408ace98b13ac0a40a56d12e3715987414139b725ebe95977a65f9b57dbf9d52" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.688711 4954 scope.go:117] "RemoveContainer" containerID="880d28c1d9494c7097260505ef6e861d83d2dda7c7a72f26aa870e73c47d8ac4" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.861274 4954 scope.go:117] "RemoveContainer" containerID="57da20c78f44687d6e19a19e0f7e423131cc5a44909bffe240062ab1d39edca5" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.915209 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.922780 4954 scope.go:117] "RemoveContainer" containerID="94601b554cf78848be44218c3a266428e96317bd91aee40881f5e1b614fbe3e1" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.926947 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:24 crc kubenswrapper[4954]: I1206 09:06:24.968892 4954 scope.go:117] "RemoveContainer" containerID="2c5d13112bfd47d6e296f52208c803d39c67ba3734ba30a7ecdb80dc5bd604ff" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.009380 4954 scope.go:117] "RemoveContainer" containerID="fd04ce991f30226b3bb92f166e317cda14679d9d93acb2f43f6c14dd283e8e2e" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014367 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle\") pod \"be6e0c18-9389-4828-8654-3de4e56894c6\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014432 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom\") pod \"b0137440-41c2-46ae-864b-c649e9331ed4\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014472 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlq4d\" (UniqueName: \"kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d\") pod \"be6e0c18-9389-4828-8654-3de4e56894c6\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014653 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data\") pod \"be6e0c18-9389-4828-8654-3de4e56894c6\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014675 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle\") pod \"b0137440-41c2-46ae-864b-c649e9331ed4\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014705 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom\") pod \"be6e0c18-9389-4828-8654-3de4e56894c6\" (UID: \"be6e0c18-9389-4828-8654-3de4e56894c6\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb52k\" (UniqueName: \"kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k\") pod \"b0137440-41c2-46ae-864b-c649e9331ed4\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.014789 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data\") pod \"b0137440-41c2-46ae-864b-c649e9331ed4\" (UID: \"b0137440-41c2-46ae-864b-c649e9331ed4\") " Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.020771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k" (OuterVolumeSpecName: 
"kube-api-access-bb52k") pod "b0137440-41c2-46ae-864b-c649e9331ed4" (UID: "b0137440-41c2-46ae-864b-c649e9331ed4"). InnerVolumeSpecName "kube-api-access-bb52k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.021089 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d" (OuterVolumeSpecName: "kube-api-access-qlq4d") pod "be6e0c18-9389-4828-8654-3de4e56894c6" (UID: "be6e0c18-9389-4828-8654-3de4e56894c6"). InnerVolumeSpecName "kube-api-access-qlq4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.021408 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be6e0c18-9389-4828-8654-3de4e56894c6" (UID: "be6e0c18-9389-4828-8654-3de4e56894c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.023973 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b0137440-41c2-46ae-864b-c649e9331ed4" (UID: "b0137440-41c2-46ae-864b-c649e9331ed4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.045552 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be6e0c18-9389-4828-8654-3de4e56894c6" (UID: "be6e0c18-9389-4828-8654-3de4e56894c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.052739 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0137440-41c2-46ae-864b-c649e9331ed4" (UID: "b0137440-41c2-46ae-864b-c649e9331ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.070274 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data" (OuterVolumeSpecName: "config-data") pod "be6e0c18-9389-4828-8654-3de4e56894c6" (UID: "be6e0c18-9389-4828-8654-3de4e56894c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.070710 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data" (OuterVolumeSpecName: "config-data") pod "b0137440-41c2-46ae-864b-c649e9331ed4" (UID: "b0137440-41c2-46ae-864b-c649e9331ed4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116784 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116810 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116821 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116830 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb52k\" (UniqueName: \"kubernetes.io/projected/b0137440-41c2-46ae-864b-c649e9331ed4-kube-api-access-bb52k\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116839 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116847 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6e0c18-9389-4828-8654-3de4e56894c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116855 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0137440-41c2-46ae-864b-c649e9331ed4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.116863 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlq4d\" (UniqueName: \"kubernetes.io/projected/be6e0c18-9389-4828-8654-3de4e56894c6-kube-api-access-qlq4d\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.269425 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f556dfdb9-m6w2r" event={"ID":"be6e0c18-9389-4828-8654-3de4e56894c6","Type":"ContainerDied","Data":"244dfa2aba4e500aa50dc9780e62a89f92d5ca7e6367225af102cbc53d179533"} Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.269444 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f556dfdb9-m6w2r" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.269510 4954 scope.go:117] "RemoveContainer" containerID="95b72cb99ed01a22b604be1e6d88d271d7b075a1724dbc6a96cd4826025fc4cd" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.272925 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877547978-9nvt2" event={"ID":"b0137440-41c2-46ae-864b-c649e9331ed4","Type":"ContainerDied","Data":"cb44965cfc9a83a14fd97807ca97ef06f772c4d492900b8c022bc50474df5347"} Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.272958 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6877547978-9nvt2" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.301378 4954 scope.go:117] "RemoveContainer" containerID="43232145371effab1f39420ce4a673d17e237826fc9fc136c21e3b6d9b4f5bda" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.315574 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.328291 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6877547978-9nvt2"] Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.339157 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.348931 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7f556dfdb9-m6w2r"] Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.459167 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" path="/var/lib/kubelet/pods/b0137440-41c2-46ae-864b-c649e9331ed4/volumes" Dec 06 09:06:25 crc kubenswrapper[4954]: I1206 09:06:25.459992 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" path="/var/lib/kubelet/pods/be6e0c18-9389-4828-8654-3de4e56894c6/volumes" Dec 06 09:06:27 crc kubenswrapper[4954]: I1206 09:06:27.128491 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 06 09:06:28 crc kubenswrapper[4954]: I1206 09:06:28.443122 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:06:28 crc kubenswrapper[4954]: E1206 09:06:28.444042 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:06:31 crc kubenswrapper[4954]: I1206 09:06:31.453300 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-944444f77-s592d" Dec 06 09:06:31 crc kubenswrapper[4954]: I1206 09:06:31.520621 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:31 crc kubenswrapper[4954]: I1206 09:06:31.520956 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6cbfdb884b-xsxh7" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" containerID="cri-o://1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" gracePeriod=60 Dec 06 09:06:34 crc kubenswrapper[4954]: E1206 09:06:34.210487 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" 
cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:34 crc kubenswrapper[4954]: E1206 09:06:34.212105 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:34 crc kubenswrapper[4954]: E1206 09:06:34.213678 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:34 crc kubenswrapper[4954]: E1206 09:06:34.213720 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6cbfdb884b-xsxh7" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" Dec 06 09:06:37 crc kubenswrapper[4954]: I1206 09:06:37.128472 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 06 09:06:42 crc kubenswrapper[4954]: I1206 09:06:42.443963 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:06:42 crc kubenswrapper[4954]: E1206 09:06:42.444766 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:06:44 crc kubenswrapper[4954]: E1206 09:06:44.211211 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d is running failed: container process not found" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:44 crc kubenswrapper[4954]: E1206 09:06:44.211803 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d is running failed: container process not found" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:44 crc kubenswrapper[4954]: E1206 09:06:44.212214 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d is running failed: container process not found" 
containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 06 09:06:44 crc kubenswrapper[4954]: E1206 09:06:44.212249 4954 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-6cbfdb884b-xsxh7" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.301327 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.398999 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwt2\" (UniqueName: \"kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2\") pod \"c709125b-f0e4-46a0-94f1-e13feafc362e\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.399140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data\") pod \"c709125b-f0e4-46a0-94f1-e13feafc362e\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.399290 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom\") pod \"c709125b-f0e4-46a0-94f1-e13feafc362e\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.399375 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle\") pod \"c709125b-f0e4-46a0-94f1-e13feafc362e\" (UID: \"c709125b-f0e4-46a0-94f1-e13feafc362e\") " Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.404997 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2" (OuterVolumeSpecName: "kube-api-access-tdwt2") pod "c709125b-f0e4-46a0-94f1-e13feafc362e" (UID: "c709125b-f0e4-46a0-94f1-e13feafc362e"). InnerVolumeSpecName "kube-api-access-tdwt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.413672 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c709125b-f0e4-46a0-94f1-e13feafc362e" (UID: "c709125b-f0e4-46a0-94f1-e13feafc362e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.432784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c709125b-f0e4-46a0-94f1-e13feafc362e" (UID: "c709125b-f0e4-46a0-94f1-e13feafc362e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.468588 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data" (OuterVolumeSpecName: "config-data") pod "c709125b-f0e4-46a0-94f1-e13feafc362e" (UID: "c709125b-f0e4-46a0-94f1-e13feafc362e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.501823 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwt2\" (UniqueName: \"kubernetes.io/projected/c709125b-f0e4-46a0-94f1-e13feafc362e-kube-api-access-tdwt2\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.501861 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.501874 4954 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.501885 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c709125b-f0e4-46a0-94f1-e13feafc362e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.504674 4954 generic.go:334] "Generic (PLEG): container finished" podID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" exitCode=0 Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.504724 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6cbfdb884b-xsxh7" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.504750 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cbfdb884b-xsxh7" event={"ID":"c709125b-f0e4-46a0-94f1-e13feafc362e","Type":"ContainerDied","Data":"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d"} Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.504777 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cbfdb884b-xsxh7" event={"ID":"c709125b-f0e4-46a0-94f1-e13feafc362e","Type":"ContainerDied","Data":"9f4759073045d996020dc3da59ee48d0e339fda8617388afe904dac0f73a557e"} Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.504792 4954 scope.go:117] "RemoveContainer" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.548275 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.549527 4954 scope.go:117] "RemoveContainer" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" Dec 06 09:06:44 crc kubenswrapper[4954]: E1206 09:06:44.550360 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d\": container with ID starting with 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d not found: ID does not exist" containerID="1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.550416 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d"} err="failed to get container status \"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d\": rpc error: code = NotFound desc = could not find container \"1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d\": container with ID starting with 1c030a0e1fdd4e9e32493182c07cc498dcdec60e86e0d8e57801a63b73302a6d not found: ID does not exist" Dec 06 09:06:44 crc kubenswrapper[4954]: I1206 09:06:44.560507 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6cbfdb884b-xsxh7"] Dec 06 09:06:45 crc kubenswrapper[4954]: I1206 09:06:45.457640 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" path="/var/lib/kubelet/pods/c709125b-f0e4-46a0-94f1-e13feafc362e/volumes" Dec 06 09:06:47 crc kubenswrapper[4954]: I1206 09:06:47.128451 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d65f959c6-krcmt" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Dec 06 09:06:47 crc kubenswrapper[4954]: I1206 09:06:47.128870 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d65f959c6-krcmt" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.118499 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d65f959c6-krcmt" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.279576 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280230 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280394 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280472 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2s5f\" (UniqueName: \"kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280551 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280777 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.280904 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key\") pod \"1709a14f-a574-4485-8993-2c5991a6ca80\" (UID: \"1709a14f-a574-4485-8993-2c5991a6ca80\") " Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.281626 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs" (OuterVolumeSpecName: "logs") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.285904 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.288404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f" (OuterVolumeSpecName: "kube-api-access-f2s5f") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "kube-api-access-f2s5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.307891 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data" (OuterVolumeSpecName: "config-data") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.311512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts" (OuterVolumeSpecName: "scripts") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.321095 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.333934 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1709a14f-a574-4485-8993-2c5991a6ca80" (UID: "1709a14f-a574-4485-8993-2c5991a6ca80"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383250 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383300 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2s5f\" (UniqueName: \"kubernetes.io/projected/1709a14f-a574-4485-8993-2c5991a6ca80-kube-api-access-f2s5f\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383315 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383328 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1709a14f-a574-4485-8993-2c5991a6ca80-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383341 4954 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383351 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1709a14f-a574-4485-8993-2c5991a6ca80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.383361 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1709a14f-a574-4485-8993-2c5991a6ca80-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.547333 4954 generic.go:334] "Generic (PLEG): container finished" podID="1709a14f-a574-4485-8993-2c5991a6ca80" containerID="a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca" exitCode=137 Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.547384 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerDied","Data":"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca"} Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.547417 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d65f959c6-krcmt" event={"ID":"1709a14f-a574-4485-8993-2c5991a6ca80","Type":"ContainerDied","Data":"7f281bef00f061c55fcd320df1a97237e14d68b9c79dbeadc3e247459cd9a142"} Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.547438 4954 scope.go:117] "RemoveContainer" containerID="74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.547389 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d65f959c6-krcmt" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.580224 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"] Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.590580 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d65f959c6-krcmt"] Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.724289 4954 scope.go:117] "RemoveContainer" containerID="a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.749316 4954 scope.go:117] "RemoveContainer" containerID="74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2" Dec 06 09:06:48 crc kubenswrapper[4954]: E1206 09:06:48.751830 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2\": container with ID starting with 74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2 not found: ID does not exist" containerID="74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.751898 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2"} err="failed to get container status \"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2\": rpc error: code = NotFound desc = could not find container \"74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2\": container with ID starting with 74a7f8fe3b65db5f113c518df505f793c9938f6e1817b39e9a6c049d7bf5f8d2 not found: ID does not exist" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.751934 4954 scope.go:117] "RemoveContainer" containerID="a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca" Dec 06 09:06:48 crc kubenswrapper[4954]: E1206 09:06:48.752433 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca\": container with ID starting with a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca not found: ID does not exist" containerID="a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca" Dec 06 09:06:48 crc kubenswrapper[4954]: I1206 09:06:48.752473 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca"} err="failed to get container status \"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca\": rpc error: code = NotFound desc = could not find container \"a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca\": container with ID starting with a6e3e7cbe32d726f40cb264aeba7d52f87477307d597c8801524cbfd58dd48ca not found: ID does not exist" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.083975 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x"] Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084344 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084361 4954 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084370 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084376 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084389 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon-log" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084395 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon-log" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084415 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084423 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084436 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084443 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084467 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084474 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084487 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084495 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084513 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084520 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084716 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084730 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084739 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbd10d0-15f8-44a6-997b-974371621108" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084753 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" 
containerName="horizon-log" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084759 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" containerName="horizon" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084771 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af86654-409d-4f8b-909c-dfe1930993e4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084786 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c709125b-f0e4-46a0-94f1-e13feafc362e" containerName="heat-engine" Dec 06 09:06:49 crc kubenswrapper[4954]: E1206 09:06:49.084973 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.084981 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.085149 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0137440-41c2-46ae-864b-c649e9331ed4" containerName="heat-cfnapi" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.085162 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6e0c18-9389-4828-8654-3de4e56894c6" containerName="heat-api" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.086119 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.088879 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.100839 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x"] Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.198534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.198630 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6scmt\" (UniqueName: \"kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.198716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.300793 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.300945 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.300981 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6scmt\" (UniqueName: \"kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.301837 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.302059 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.322622 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6scmt\" (UniqueName: \"kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.411586 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.462035 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1709a14f-a574-4485-8993-2c5991a6ca80" path="/var/lib/kubelet/pods/1709a14f-a574-4485-8993-2c5991a6ca80/volumes" Dec 06 09:06:49 crc kubenswrapper[4954]: I1206 09:06:49.896729 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x"] Dec 06 09:06:50 crc kubenswrapper[4954]: I1206 09:06:50.568039 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerStarted","Data":"e0f06860965ad0b15ea00c44bb7b87551c76821fc98ee4594be0f16b71c29650"} Dec 06 09:06:50 crc kubenswrapper[4954]: I1206 09:06:50.568383 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerStarted","Data":"ce1c566d03ac0ee55fa9a8ed26ce5764d5598c16adf84ebaa757324dda077174"} Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.581894 4954 generic.go:334] "Generic (PLEG): container finished" podID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerID="e0f06860965ad0b15ea00c44bb7b87551c76821fc98ee4594be0f16b71c29650" exitCode=0 Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.581977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerDied","Data":"e0f06860965ad0b15ea00c44bb7b87551c76821fc98ee4594be0f16b71c29650"} Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.629647 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.634692 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.663343 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.749643 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.749702 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.749788 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhl4\" (UniqueName: \"kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.852011 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhl4\" (UniqueName: \"kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.852157 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.852191 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.852737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.852772 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.873706 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xdhl4\" (UniqueName: \"kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4\") pod \"redhat-operators-dx7bp\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:51 crc kubenswrapper[4954]: I1206 09:06:51.956091 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:06:52 crc kubenswrapper[4954]: I1206 09:06:52.455612 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:06:52 crc kubenswrapper[4954]: W1206 09:06:52.473920 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcced3467_688c_4256_b9e5_bcbc7f1d0634.slice/crio-3d1fb8462c6b664bd2cf55e6b922b0d29ca67318c79399378449adc15b53c979 WatchSource:0}: Error finding container 3d1fb8462c6b664bd2cf55e6b922b0d29ca67318c79399378449adc15b53c979: Status 404 returned error can't find the container with id 3d1fb8462c6b664bd2cf55e6b922b0d29ca67318c79399378449adc15b53c979 Dec 06 09:06:52 crc kubenswrapper[4954]: I1206 09:06:52.595208 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerStarted","Data":"3d1fb8462c6b664bd2cf55e6b922b0d29ca67318c79399378449adc15b53c979"} Dec 06 09:06:53 crc kubenswrapper[4954]: I1206 09:06:53.442919 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:06:53 crc kubenswrapper[4954]: E1206 09:06:53.443378 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:06:53 crc kubenswrapper[4954]: I1206 09:06:53.604709 4954 generic.go:334] "Generic (PLEG): container finished" podID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerID="81e69cde72344d4c7fbc4e85e93b88aeec06e28629accc30e8d57f3fce876cdd" exitCode=0 Dec 06 09:06:53 crc kubenswrapper[4954]: I1206 09:06:53.604792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerDied","Data":"81e69cde72344d4c7fbc4e85e93b88aeec06e28629accc30e8d57f3fce876cdd"} Dec 06 09:06:53 crc kubenswrapper[4954]: I1206 09:06:53.607239 4954 generic.go:334] "Generic (PLEG): container finished" podID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerID="2ac66523b646f9e971c259e48252299c46d977788c3117861010defcd4d7315f" exitCode=0 Dec 06 09:06:53 crc kubenswrapper[4954]: I1206 09:06:53.607286 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerDied","Data":"2ac66523b646f9e971c259e48252299c46d977788c3117861010defcd4d7315f"} Dec 06 09:06:54 crc kubenswrapper[4954]: I1206 09:06:54.620591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" 
event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerStarted","Data":"2046032f41f5726627699666f0fd05b3da1dab492ed4d397317e1b655090f417"} Dec 06 09:06:54 crc kubenswrapper[4954]: I1206 09:06:54.623308 4954 generic.go:334] "Generic (PLEG): container finished" podID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerID="6ac5a875c6efd60bf7327bff0d07a28b267306f76a3f8ff115cc593f583b450e" exitCode=0 Dec 06 09:06:54 crc kubenswrapper[4954]: I1206 09:06:54.623359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerDied","Data":"6ac5a875c6efd60bf7327bff0d07a28b267306f76a3f8ff115cc593f583b450e"} Dec 06 09:06:55 crc kubenswrapper[4954]: I1206 09:06:55.978112 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.162416 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle\") pod \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.162480 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6scmt\" (UniqueName: \"kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt\") pod \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.162654 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util\") pod \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\" (UID: \"b46b0863-d9e9-4adb-a2ba-0e992b84dc73\") " Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.163939 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle" (OuterVolumeSpecName: "bundle") pod "b46b0863-d9e9-4adb-a2ba-0e992b84dc73" (UID: "b46b0863-d9e9-4adb-a2ba-0e992b84dc73"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.169460 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt" (OuterVolumeSpecName: "kube-api-access-6scmt") pod "b46b0863-d9e9-4adb-a2ba-0e992b84dc73" (UID: "b46b0863-d9e9-4adb-a2ba-0e992b84dc73"). InnerVolumeSpecName "kube-api-access-6scmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.265013 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6scmt\" (UniqueName: \"kubernetes.io/projected/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-kube-api-access-6scmt\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.265051 4954 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.652953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" event={"ID":"b46b0863-d9e9-4adb-a2ba-0e992b84dc73","Type":"ContainerDied","Data":"ce1c566d03ac0ee55fa9a8ed26ce5764d5598c16adf84ebaa757324dda077174"} Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.653387 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce1c566d03ac0ee55fa9a8ed26ce5764d5598c16adf84ebaa757324dda077174" Dec 06 09:06:56 crc kubenswrapper[4954]: I1206 09:06:56.653065 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x" Dec 06 09:06:57 crc kubenswrapper[4954]: I1206 09:06:57.595172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util" (OuterVolumeSpecName: "util") pod "b46b0863-d9e9-4adb-a2ba-0e992b84dc73" (UID: "b46b0863-d9e9-4adb-a2ba-0e992b84dc73"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:06:57 crc kubenswrapper[4954]: I1206 09:06:57.595932 4954 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b46b0863-d9e9-4adb-a2ba-0e992b84dc73-util\") on node \"crc\" DevicePath \"\"" Dec 06 09:06:57 crc kubenswrapper[4954]: I1206 09:06:57.662480 4954 generic.go:334] "Generic (PLEG): container finished" podID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerID="2046032f41f5726627699666f0fd05b3da1dab492ed4d397317e1b655090f417" exitCode=0 Dec 06 09:06:57 crc kubenswrapper[4954]: I1206 09:06:57.662522 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerDied","Data":"2046032f41f5726627699666f0fd05b3da1dab492ed4d397317e1b655090f417"} Dec 06 09:06:58 crc kubenswrapper[4954]: I1206 09:06:58.675865 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerStarted","Data":"fe9a4d064eec26b81bc811a39eccef46524147caf0eded57a60ee4dd050349a6"} Dec 06 09:06:58 crc kubenswrapper[4954]: I1206 09:06:58.695620 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dx7bp" podStartSLOduration=3.125446817 podStartE2EDuration="7.695587006s" podCreationTimestamp="2025-12-06 09:06:51 +0000 UTC" firstStartedPulling="2025-12-06 09:06:53.608495588 +0000 UTC m=+7788.421854977" lastFinishedPulling="2025-12-06 09:06:58.178635767 +0000 UTC m=+7792.991995166" observedRunningTime="2025-12-06 09:06:58.691375133 +0000 UTC m=+7793.504734522" watchObservedRunningTime="2025-12-06 
09:06:58.695587006 +0000 UTC m=+7793.508946405" Dec 06 09:07:01 crc kubenswrapper[4954]: I1206 09:07:01.956585 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:01 crc kubenswrapper[4954]: I1206 09:07:01.956901 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:03 crc kubenswrapper[4954]: I1206 09:07:03.034762 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dx7bp" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" probeResult="failure" output=< Dec 06 09:07:03 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 09:07:03 crc kubenswrapper[4954]: > Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.758544 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s"] Dec 06 09:07:06 crc kubenswrapper[4954]: E1206 09:07:06.759652 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="pull" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.759675 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="pull" Dec 06 09:07:06 crc kubenswrapper[4954]: E1206 09:07:06.759722 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="util" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.759731 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="util" Dec 06 09:07:06 crc kubenswrapper[4954]: E1206 09:07:06.759773 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="extract" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.759785 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="extract" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.760337 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46b0863-d9e9-4adb-a2ba-0e992b84dc73" containerName="extract" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.761694 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.763576 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x2mm2" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.767648 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.768088 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.801487 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s"] Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.879788 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9htn\" (UniqueName: \"kubernetes.io/projected/c5a70aef-4a54-42be-9bf4-1832a3497ac5-kube-api-access-k9htn\") pod \"obo-prometheus-operator-668cf9dfbb-wdm7s\" (UID: \"c5a70aef-4a54-42be-9bf4-1832a3497ac5\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.883212 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph"] Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.884577 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.889238 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.889407 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9gj47" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.899994 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb"] Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.901315 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.927050 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph"] Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.938410 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb"] Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.983437 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.983492 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.983584 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: \"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.983662 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: \"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:06 crc kubenswrapper[4954]: I1206 09:07:06.983759 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9htn\" (UniqueName: \"kubernetes.io/projected/c5a70aef-4a54-42be-9bf4-1832a3497ac5-kube-api-access-k9htn\") pod \"obo-prometheus-operator-668cf9dfbb-wdm7s\" (UID: \"c5a70aef-4a54-42be-9bf4-1832a3497ac5\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.018709 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9htn\" (UniqueName: \"kubernetes.io/projected/c5a70aef-4a54-42be-9bf4-1832a3497ac5-kube-api-access-k9htn\") pod \"obo-prometheus-operator-668cf9dfbb-wdm7s\" (UID: \"c5a70aef-4a54-42be-9bf4-1832a3497ac5\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.072678 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-bppch"] Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.074252 4954 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.076200 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-ssmqk" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.077306 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.085743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.085794 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: \"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.085853 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: \"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.085966 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.091673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.091777 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe9110b5-92c9-4f7a-8f19-e8eef8fd7040-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb\" (UID: \"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.094662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: 
\"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.096434 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.099274 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb1ff733-3aca-4019-8535-ce5a462c6099-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph\" (UID: \"fb1ff733-3aca-4019-8535-ce5a462c6099\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.099613 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-bppch"] Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.187939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpb5g\" (UniqueName: \"kubernetes.io/projected/acee39da-b368-4329-921b-8051f2493627-kube-api-access-jpb5g\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.188015 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/acee39da-b368-4329-921b-8051f2493627-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.203435 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.227096 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.296244 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-fmvfc"] Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.297778 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.301138 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpb5g\" (UniqueName: \"kubernetes.io/projected/acee39da-b368-4329-921b-8051f2493627-kube-api-access-jpb5g\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.301226 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/acee39da-b368-4329-921b-8051f2493627-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.307807 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-w8prv" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.311852 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/acee39da-b368-4329-921b-8051f2493627-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.332402 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-fmvfc"] Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.333894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpb5g\" (UniqueName: \"kubernetes.io/projected/acee39da-b368-4329-921b-8051f2493627-kube-api-access-jpb5g\") pod \"observability-operator-d8bb48f5d-bppch\" (UID: \"acee39da-b368-4329-921b-8051f2493627\") " pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.403337 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9wzw\" (UniqueName: \"kubernetes.io/projected/4105a6d8-9e43-435c-b69f-de2f10d97eea-kube-api-access-m9wzw\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.403385 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4105a6d8-9e43-435c-b69f-de2f10d97eea-openshift-service-ca\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.444494 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:07:07 crc kubenswrapper[4954]: E1206 09:07:07.444720 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.501031 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.505041 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wzw\" (UniqueName: \"kubernetes.io/projected/4105a6d8-9e43-435c-b69f-de2f10d97eea-kube-api-access-m9wzw\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.505078 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4105a6d8-9e43-435c-b69f-de2f10d97eea-openshift-service-ca\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.506512 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4105a6d8-9e43-435c-b69f-de2f10d97eea-openshift-service-ca\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.538143 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wzw\" (UniqueName: \"kubernetes.io/projected/4105a6d8-9e43-435c-b69f-de2f10d97eea-kube-api-access-m9wzw\") pod \"perses-operator-5446b9c989-fmvfc\" (UID: \"4105a6d8-9e43-435c-b69f-de2f10d97eea\") " pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.644172 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s"] Dec 06 09:07:07 crc kubenswrapper[4954]: W1206 09:07:07.646440 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a70aef_4a54_42be_9bf4_1832a3497ac5.slice/crio-6c737d785e9bf01e3f24d1f072a248cc70bdf4a89c9e2251be1c0537cafee6f4 WatchSource:0}: Error finding container 6c737d785e9bf01e3f24d1f072a248cc70bdf4a89c9e2251be1c0537cafee6f4: Status 404 returned error can't find the container with id 6c737d785e9bf01e3f24d1f072a248cc70bdf4a89c9e2251be1c0537cafee6f4 Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.675125 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.838498 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" event={"ID":"c5a70aef-4a54-42be-9bf4-1832a3497ac5","Type":"ContainerStarted","Data":"6c737d785e9bf01e3f24d1f072a248cc70bdf4a89c9e2251be1c0537cafee6f4"} Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.847357 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb"] Dec 06 09:07:07 crc kubenswrapper[4954]: W1206 09:07:07.872149 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe9110b5_92c9_4f7a_8f19_e8eef8fd7040.slice/crio-e5c26464ef05df3be4612f32d5a58af79f8f9ef5089b10ae5897b1186ff0e0b5 WatchSource:0}: Error finding container e5c26464ef05df3be4612f32d5a58af79f8f9ef5089b10ae5897b1186ff0e0b5: Status 404 returned error can't find the container with id e5c26464ef05df3be4612f32d5a58af79f8f9ef5089b10ae5897b1186ff0e0b5 Dec 06 09:07:07 crc kubenswrapper[4954]: I1206 09:07:07.948486 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph"] Dec 06 09:07:07 crc kubenswrapper[4954]: W1206 09:07:07.960143 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1ff733_3aca_4019_8535_ce5a462c6099.slice/crio-181dca624ac3810069521e0a3b0bc144356fe168afd6d27f207205c0bf62ed4f WatchSource:0}: Error finding container 181dca624ac3810069521e0a3b0bc144356fe168afd6d27f207205c0bf62ed4f: Status 404 returned error can't find the container with id 181dca624ac3810069521e0a3b0bc144356fe168afd6d27f207205c0bf62ed4f Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.037116 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-bppch"] Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.243170 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-fmvfc"] Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.849081 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" event={"ID":"fb1ff733-3aca-4019-8535-ce5a462c6099","Type":"ContainerStarted","Data":"181dca624ac3810069521e0a3b0bc144356fe168afd6d27f207205c0bf62ed4f"} Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.850773 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" event={"ID":"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040","Type":"ContainerStarted","Data":"e5c26464ef05df3be4612f32d5a58af79f8f9ef5089b10ae5897b1186ff0e0b5"} Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.851862 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" event={"ID":"acee39da-b368-4329-921b-8051f2493627","Type":"ContainerStarted","Data":"8dab308f30b11c5efdbcff483d6bac44824a7a1a891abc320126c2e70a6a1c20"} Dec 06 09:07:08 crc kubenswrapper[4954]: I1206 09:07:08.853544 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" 
event={"ID":"4105a6d8-9e43-435c-b69f-de2f10d97eea","Type":"ContainerStarted","Data":"b12e5e1c340821e60f33b8356a1174437245a7a512797d97b3272341c6ed62ff"} Dec 06 09:07:13 crc kubenswrapper[4954]: I1206 09:07:13.027016 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dx7bp" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" probeResult="failure" output=< Dec 06 09:07:13 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 09:07:13 crc kubenswrapper[4954]: > Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.984769 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" event={"ID":"fb1ff733-3aca-4019-8535-ce5a462c6099","Type":"ContainerStarted","Data":"219e58dbe7812ecec54a59863b05cefe80980d062a4182290bf29b553c6cca4d"} Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.989335 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" event={"ID":"fe9110b5-92c9-4f7a-8f19-e8eef8fd7040","Type":"ContainerStarted","Data":"308f4f8efa648abe8f616d539e55ad8a19d3cf3028b323bef235b12909983c03"} Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.992023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" event={"ID":"acee39da-b368-4329-921b-8051f2493627","Type":"ContainerStarted","Data":"a278fc3368b83abe1c781660505b5534e50f5bd29ae85c4a2e318f321c48127a"} Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.992220 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.996061 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" event={"ID":"4105a6d8-9e43-435c-b69f-de2f10d97eea","Type":"ContainerStarted","Data":"5e7cef82f7366ac3417ee0e611cb789c9fc3cc6a0c1159f54bbfe27740ba860f"} Dec 06 09:07:18 crc kubenswrapper[4954]: I1206 09:07:18.996329 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.008152 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph" podStartSLOduration=2.825431796 podStartE2EDuration="13.00811056s" podCreationTimestamp="2025-12-06 09:07:06 +0000 UTC" firstStartedPulling="2025-12-06 09:07:07.97037819 +0000 UTC m=+7802.783737579" lastFinishedPulling="2025-12-06 09:07:18.153056954 +0000 UTC m=+7812.966416343" observedRunningTime="2025-12-06 09:07:19.003439685 +0000 UTC m=+7813.816799074" watchObservedRunningTime="2025-12-06 09:07:19.00811056 +0000 UTC m=+7813.821469959" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.033833 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" podStartSLOduration=2.036086409 podStartE2EDuration="12.033809268s" podCreationTimestamp="2025-12-06 09:07:07 +0000 UTC" firstStartedPulling="2025-12-06 09:07:08.239263683 +0000 UTC m=+7803.052623072" lastFinishedPulling="2025-12-06 09:07:18.236986542 +0000 UTC m=+7813.050345931" observedRunningTime="2025-12-06 09:07:19.029247746 +0000 UTC m=+7813.842607145" 
watchObservedRunningTime="2025-12-06 09:07:19.033809268 +0000 UTC m=+7813.847168657" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.078184 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.080082 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-bppch" podStartSLOduration=1.896764007 podStartE2EDuration="12.080057157s" podCreationTimestamp="2025-12-06 09:07:07 +0000 UTC" firstStartedPulling="2025-12-06 09:07:08.060983707 +0000 UTC m=+7802.874343096" lastFinishedPulling="2025-12-06 09:07:18.244276857 +0000 UTC m=+7813.057636246" observedRunningTime="2025-12-06 09:07:19.064988864 +0000 UTC m=+7813.878348253" watchObservedRunningTime="2025-12-06 09:07:19.080057157 +0000 UTC m=+7813.893416546" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.104294 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb" podStartSLOduration=2.840629183 podStartE2EDuration="13.104272686s" podCreationTimestamp="2025-12-06 09:07:06 +0000 UTC" firstStartedPulling="2025-12-06 09:07:07.889706499 +0000 UTC m=+7802.703065898" lastFinishedPulling="2025-12-06 09:07:18.153350012 +0000 UTC m=+7812.966709401" observedRunningTime="2025-12-06 09:07:19.100521146 +0000 UTC m=+7813.913880535" watchObservedRunningTime="2025-12-06 09:07:19.104272686 +0000 UTC m=+7813.917632075" Dec 06 09:07:19 crc kubenswrapper[4954]: I1206 09:07:19.444161 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:07:19 crc kubenswrapper[4954]: E1206 09:07:19.444494 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:07:20 crc kubenswrapper[4954]: I1206 09:07:20.007947 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" event={"ID":"c5a70aef-4a54-42be-9bf4-1832a3497ac5","Type":"ContainerStarted","Data":"3b42ee04b87c96747d47541a0e0f3f2d371cd39e5f1cbb793feb0a57fb1344cf"} Dec 06 09:07:20 crc kubenswrapper[4954]: I1206 09:07:20.034393 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wdm7s" podStartSLOduration=3.446460132 podStartE2EDuration="14.034369512s" podCreationTimestamp="2025-12-06 09:07:06 +0000 UTC" firstStartedPulling="2025-12-06 09:07:07.654306712 +0000 UTC m=+7802.467666101" lastFinishedPulling="2025-12-06 09:07:18.242216092 +0000 UTC m=+7813.055575481" observedRunningTime="2025-12-06 09:07:20.026118031 +0000 UTC m=+7814.839477430" watchObservedRunningTime="2025-12-06 09:07:20.034369512 +0000 UTC m=+7814.847728901" Dec 06 09:07:22 crc kubenswrapper[4954]: I1206 09:07:22.014200 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:22 crc kubenswrapper[4954]: I1206 09:07:22.079029 4954 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:23 crc kubenswrapper[4954]: I1206 09:07:23.882051 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:07:23 crc kubenswrapper[4954]: I1206 09:07:23.883096 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dx7bp" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" containerID="cri-o://fe9a4d064eec26b81bc811a39eccef46524147caf0eded57a60ee4dd050349a6" gracePeriod=2 Dec 06 09:07:24 crc kubenswrapper[4954]: I1206 09:07:24.067997 4954 generic.go:334] "Generic (PLEG): container finished" podID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerID="fe9a4d064eec26b81bc811a39eccef46524147caf0eded57a60ee4dd050349a6" exitCode=0 Dec 06 09:07:24 crc kubenswrapper[4954]: I1206 09:07:24.068041 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerDied","Data":"fe9a4d064eec26b81bc811a39eccef46524147caf0eded57a60ee4dd050349a6"} Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.063844 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.077385 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7bp" event={"ID":"cced3467-688c-4256-b9e5-bcbc7f1d0634","Type":"ContainerDied","Data":"3d1fb8462c6b664bd2cf55e6b922b0d29ca67318c79399378449adc15b53c979"} Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.077442 4954 scope.go:117] "RemoveContainer" containerID="fe9a4d064eec26b81bc811a39eccef46524147caf0eded57a60ee4dd050349a6" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.077560 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7bp" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.113507 4954 scope.go:117] "RemoveContainer" containerID="2046032f41f5726627699666f0fd05b3da1dab492ed4d397317e1b655090f417" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.154473 4954 scope.go:117] "RemoveContainer" containerID="81e69cde72344d4c7fbc4e85e93b88aeec06e28629accc30e8d57f3fce876cdd" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.204455 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhl4\" (UniqueName: \"kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4\") pod \"cced3467-688c-4256-b9e5-bcbc7f1d0634\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.204532 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content\") pod \"cced3467-688c-4256-b9e5-bcbc7f1d0634\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.204573 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities\") pod \"cced3467-688c-4256-b9e5-bcbc7f1d0634\" (UID: \"cced3467-688c-4256-b9e5-bcbc7f1d0634\") " Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.205607 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities" (OuterVolumeSpecName: "utilities") pod "cced3467-688c-4256-b9e5-bcbc7f1d0634" (UID: "cced3467-688c-4256-b9e5-bcbc7f1d0634"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.237881 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4" (OuterVolumeSpecName: "kube-api-access-xdhl4") pod "cced3467-688c-4256-b9e5-bcbc7f1d0634" (UID: "cced3467-688c-4256-b9e5-bcbc7f1d0634"). InnerVolumeSpecName "kube-api-access-xdhl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.307174 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhl4\" (UniqueName: \"kubernetes.io/projected/cced3467-688c-4256-b9e5-bcbc7f1d0634-kube-api-access-xdhl4\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.307212 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.345316 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cced3467-688c-4256-b9e5-bcbc7f1d0634" (UID: "cced3467-688c-4256-b9e5-bcbc7f1d0634"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.408959 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cced3467-688c-4256-b9e5-bcbc7f1d0634-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.414907 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.423628 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dx7bp"] Dec 06 09:07:25 crc kubenswrapper[4954]: I1206 09:07:25.454499 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" path="/var/lib/kubelet/pods/cced3467-688c-4256-b9e5-bcbc7f1d0634/volumes" Dec 06 09:07:27 crc kubenswrapper[4954]: I1206 09:07:27.677531 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-fmvfc" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.319549 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.321006 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="56926ef8-fd5e-4746-972f-42a622dabda3" containerName="openstackclient" containerID="cri-o://50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4" gracePeriod=2 Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.330467 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.379765 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 09:07:31 crc kubenswrapper[4954]: E1206 09:07:31.380192 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56926ef8-fd5e-4746-972f-42a622dabda3" containerName="openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.380208 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="56926ef8-fd5e-4746-972f-42a622dabda3" containerName="openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: E1206 09:07:31.380222 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.380228 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" Dec 06 09:07:31 crc kubenswrapper[4954]: E1206 09:07:31.380238 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="extract-content" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.380244 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="extract-content" Dec 06 09:07:31 crc kubenswrapper[4954]: E1206 09:07:31.380264 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="extract-utilities" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.380271 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="extract-utilities" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 
09:07:31.380459 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cced3467-688c-4256-b9e5-bcbc7f1d0634" containerName="registry-server" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.380477 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="56926ef8-fd5e-4746-972f-42a622dabda3" containerName="openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.381191 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.401298 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.429417 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.429498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.429527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.429758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9ns\" (UniqueName: \"kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.430784 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="56926ef8-fd5e-4746-972f-42a622dabda3" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.535080 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.535150 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.535174 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.535297 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9ns\" (UniqueName: \"kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.537373 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.548085 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.571361 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.594345 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9ns\" (UniqueName: \"kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns\") pod \"openstackclient\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.596973 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.598696 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.609355 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-k79gk" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.610738 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.636951 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcf2\" (UniqueName: \"kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2\") pod \"kube-state-metrics-0\" (UID: \"e01f09b1-4b58-4eb8-90cf-d7738a282534\") " pod="openstack/kube-state-metrics-0" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.726113 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.742990 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcf2\" (UniqueName: \"kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2\") pod \"kube-state-metrics-0\" (UID: \"e01f09b1-4b58-4eb8-90cf-d7738a282534\") " pod="openstack/kube-state-metrics-0" Dec 06 09:07:31 crc kubenswrapper[4954]: I1206 09:07:31.774609 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcf2\" (UniqueName: \"kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2\") pod \"kube-state-metrics-0\" (UID: \"e01f09b1-4b58-4eb8-90cf-d7738a282534\") " pod="openstack/kube-state-metrics-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.011812 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.347555 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.351774 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.361302 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.361478 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.361608 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.361801 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.363179 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-mknqh" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.405042 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467185 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shxsm\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-kube-api-access-shxsm\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467354 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-volume\") pod 
\"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467379 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467400 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467419 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.467467 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.569707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570027 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shxsm\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-kube-api-access-shxsm\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570093 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570161 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570179 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.570237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.584095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.606679 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.619164 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.619287 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2b2f86-d759-4e14-a011-a0060c5003a2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.630064 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.642095 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.650260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2b2f86-d759-4e14-a011-a0060c5003a2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.661096 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shxsm\" (UniqueName: \"kubernetes.io/projected/3e2b2f86-d759-4e14-a011-a0060c5003a2-kube-api-access-shxsm\") pod \"alertmanager-metric-storage-0\" (UID: \"3e2b2f86-d759-4e14-a011-a0060c5003a2\") " pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:32 crc kubenswrapper[4954]: I1206 09:07:32.677799 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.005478 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.043583 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.043691 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.051375 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.051781 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.051948 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-qgxk7" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.052043 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.052137 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.052237 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190644 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190763 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190797 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190817 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190857 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.190877 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzkp4\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.191099 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.200858 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"573df1c9-ef2d-4407-858d-1fa0bdc533fd","Type":"ContainerStarted","Data":"1c25547b52b8a360560795ebaba47e4e4aeec0c540b0452a555dd9b33677b690"} Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.294127 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.294507 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzkp4\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.294797 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.294947 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.295135 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.295280 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.295312 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.295442 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.296207 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.315971 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.316114 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.316753 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.317152 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.321901 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.321949 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/23b790ef866cf8a9b9301f286c81631c28f0905e1c0059509880a0abce6a0705/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.322958 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.326189 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkp4\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.415841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.430416 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:07:33 crc kubenswrapper[4954]: I1206 09:07:33.693280 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.053650 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.107142 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.153159 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret\") pod \"56926ef8-fd5e-4746-972f-42a622dabda3\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.153295 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle\") pod \"56926ef8-fd5e-4746-972f-42a622dabda3\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.154131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config\") pod \"56926ef8-fd5e-4746-972f-42a622dabda3\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.154215 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8jsh\" (UniqueName: \"kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh\") pod \"56926ef8-fd5e-4746-972f-42a622dabda3\" (UID: \"56926ef8-fd5e-4746-972f-42a622dabda3\") " Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.159379 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh" (OuterVolumeSpecName: "kube-api-access-n8jsh") pod "56926ef8-fd5e-4746-972f-42a622dabda3" (UID: "56926ef8-fd5e-4746-972f-42a622dabda3"). InnerVolumeSpecName "kube-api-access-n8jsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.216640 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56926ef8-fd5e-4746-972f-42a622dabda3" (UID: "56926ef8-fd5e-4746-972f-42a622dabda3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.221668 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3e2b2f86-d759-4e14-a011-a0060c5003a2","Type":"ContainerStarted","Data":"a7d7b5df328929bd8bf3d64a48c81cb41fcf3c90c9ad0d26f2fd780f7720ef99"} Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.233443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "56926ef8-fd5e-4746-972f-42a622dabda3" (UID: "56926ef8-fd5e-4746-972f-42a622dabda3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.235301 4954 generic.go:334] "Generic (PLEG): container finished" podID="56926ef8-fd5e-4746-972f-42a622dabda3" containerID="50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4" exitCode=137 Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.235405 4954 scope.go:117] "RemoveContainer" containerID="50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.235444 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.242217 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e01f09b1-4b58-4eb8-90cf-d7738a282534","Type":"ContainerStarted","Data":"6a104489772e875edc8d11eb4688bb36944a923915ef0c8e9ee52d11952916e9"} Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.242271 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e01f09b1-4b58-4eb8-90cf-d7738a282534","Type":"ContainerStarted","Data":"7326f6faf9b2ed8d9f0a01b3ec234792eb3f24e5b6306dc5b5372566c9c5dadc"} Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.243190 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "56926ef8-fd5e-4746-972f-42a622dabda3" (UID: "56926ef8-fd5e-4746-972f-42a622dabda3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.243284 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.249436 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerStarted","Data":"d762c13aaefdbffe278bcc252c447a296d2d04aa3bb608689ef460a12f8713a2"} Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.257513 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.257551 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56926ef8-fd5e-4746-972f-42a622dabda3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.257599 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/56926ef8-fd5e-4746-972f-42a622dabda3-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.257628 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8jsh\" (UniqueName: \"kubernetes.io/projected/56926ef8-fd5e-4746-972f-42a622dabda3-kube-api-access-n8jsh\") on node \"crc\" DevicePath \"\"" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.258616 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"573df1c9-ef2d-4407-858d-1fa0bdc533fd","Type":"ContainerStarted","Data":"8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5"} Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.261199 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.735732659 podStartE2EDuration="3.261180165s" podCreationTimestamp="2025-12-06 09:07:31 +0000 UTC" firstStartedPulling="2025-12-06 09:07:33.200965444 +0000 UTC m=+7828.014324833" lastFinishedPulling="2025-12-06 09:07:33.72641295 +0000 UTC m=+7828.539772339" observedRunningTime="2025-12-06 09:07:34.256455629 +0000 UTC m=+7829.069815028" watchObservedRunningTime="2025-12-06 09:07:34.261180165 +0000 UTC m=+7829.074539554" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.279349 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.279324811 podStartE2EDuration="3.279324811s" podCreationTimestamp="2025-12-06 09:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:07:34.27776071 +0000 UTC m=+7829.091120119" watchObservedRunningTime="2025-12-06 09:07:34.279324811 +0000 UTC m=+7829.092684200" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.281857 4954 scope.go:117] "RemoveContainer" containerID="50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4" Dec 06 09:07:34 crc kubenswrapper[4954]: E1206 09:07:34.283168 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4\": container with ID starting with 50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4 not found: ID does not exist" containerID="50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.283228 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4"} err="failed to get container status \"50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4\": rpc error: code = NotFound desc = could not find container \"50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4\": container with ID starting with 50d6661c17d8765a45cf437582afafb80de05bdb5b77260c55310a025f3a6fd4 not found: ID does not exist" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.443940 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:07:34 crc kubenswrapper[4954]: E1206 09:07:34.444188 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:07:34 crc kubenswrapper[4954]: I1206 09:07:34.605701 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="56926ef8-fd5e-4746-972f-42a622dabda3" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" Dec 06 09:07:35 crc kubenswrapper[4954]: 
I1206 09:07:35.465888 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56926ef8-fd5e-4746-972f-42a622dabda3" path="/var/lib/kubelet/pods/56926ef8-fd5e-4746-972f-42a622dabda3/volumes" Dec 06 09:07:39 crc kubenswrapper[4954]: I1206 09:07:39.347271 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerStarted","Data":"171aef1870503661b37f65aa48a9fd84082071336e538a3380a13ecca8594ff3"} Dec 06 09:07:40 crc kubenswrapper[4954]: I1206 09:07:40.363800 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3e2b2f86-d759-4e14-a011-a0060c5003a2","Type":"ContainerStarted","Data":"09cddd881bfa6e63787a3c9475285937248cd453996b8fbfb870bdf01d5ab9b5"} Dec 06 09:07:42 crc kubenswrapper[4954]: I1206 09:07:42.017458 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.035653 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-z8js6"] Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.046663 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a3d8-account-create-update-6p8sr"] Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.056805 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-z8js6"] Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.065406 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a3d8-account-create-update-6p8sr"] Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.462767 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc0a40a-e991-4416-962b-fad304dc412c" path="/var/lib/kubelet/pods/adc0a40a-e991-4416-962b-fad304dc412c/volumes" Dec 06 09:07:45 crc kubenswrapper[4954]: I1206 09:07:45.463611 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded01376-eebe-4038-99eb-bd70cbc5e61f" path="/var/lib/kubelet/pods/ded01376-eebe-4038-99eb-bd70cbc5e61f/volumes" Dec 06 09:07:46 crc kubenswrapper[4954]: I1206 09:07:46.420322 4954 generic.go:334] "Generic (PLEG): container finished" podID="2a044e5d-2332-40cc-890b-7da9e544be80" containerID="171aef1870503661b37f65aa48a9fd84082071336e538a3380a13ecca8594ff3" exitCode=0 Dec 06 09:07:46 crc kubenswrapper[4954]: I1206 09:07:46.420376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerDied","Data":"171aef1870503661b37f65aa48a9fd84082071336e538a3380a13ecca8594ff3"} Dec 06 09:07:46 crc kubenswrapper[4954]: I1206 09:07:46.443203 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:07:46 crc kubenswrapper[4954]: E1206 09:07:46.443458 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:07:48 crc kubenswrapper[4954]: I1206 09:07:48.457661 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="3e2b2f86-d759-4e14-a011-a0060c5003a2" containerID="09cddd881bfa6e63787a3c9475285937248cd453996b8fbfb870bdf01d5ab9b5" exitCode=0 Dec 06 09:07:48 crc kubenswrapper[4954]: I1206 09:07:48.458016 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3e2b2f86-d759-4e14-a011-a0060c5003a2","Type":"ContainerDied","Data":"09cddd881bfa6e63787a3c9475285937248cd453996b8fbfb870bdf01d5ab9b5"} Dec 06 09:07:54 crc kubenswrapper[4954]: I1206 09:07:54.519509 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3e2b2f86-d759-4e14-a011-a0060c5003a2","Type":"ContainerStarted","Data":"93f7bde47c20702e7ec95febf89c5d780604fa1c62c16fcfe8a66b2d637fee78"} Dec 06 09:07:54 crc kubenswrapper[4954]: I1206 09:07:54.521770 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerStarted","Data":"fdb185abee0820b7528d4e84a2bf7ee2cbde8f9d1a8c41b7650bda5ce79b7791"} Dec 06 09:07:58 crc kubenswrapper[4954]: I1206 09:07:58.563533 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerStarted","Data":"20373a7b70751ccb8cbf5ed64b1c60a0b3f33e1a66cf6b1c9f88b22034679046"} Dec 06 09:07:58 crc kubenswrapper[4954]: I1206 09:07:58.566428 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3e2b2f86-d759-4e14-a011-a0060c5003a2","Type":"ContainerStarted","Data":"22036bca601950176b1df5e95f16b7e81e0e1f55b01f73b249240c5e1ffb1545"} Dec 06 09:07:58 crc kubenswrapper[4954]: I1206 09:07:58.566694 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:58 crc kubenswrapper[4954]: I1206 09:07:58.568897 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 06 09:07:58 crc kubenswrapper[4954]: I1206 09:07:58.592276 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.416938655 podStartE2EDuration="26.592254982s" podCreationTimestamp="2025-12-06 09:07:32 +0000 UTC" firstStartedPulling="2025-12-06 09:07:33.650284861 +0000 UTC m=+7828.463644240" lastFinishedPulling="2025-12-06 09:07:53.825601178 +0000 UTC m=+7848.638960567" observedRunningTime="2025-12-06 09:07:58.585980134 +0000 UTC m=+7853.399339523" watchObservedRunningTime="2025-12-06 09:07:58.592254982 +0000 UTC m=+7853.405614371" Dec 06 09:08:00 crc kubenswrapper[4954]: I1206 09:08:00.443728 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:08:00 crc kubenswrapper[4954]: E1206 09:08:00.444176 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:08:02 crc kubenswrapper[4954]: I1206 09:08:02.647885 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerStarted","Data":"ee3ab5956a46d58e248250f896aec6ad3ad89ff417ade80522493ed9b081f489"} Dec 06 09:08:02 crc kubenswrapper[4954]: I1206 09:08:02.688134 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.844698996 podStartE2EDuration="31.688111575s" podCreationTimestamp="2025-12-06 09:07:31 +0000 UTC" firstStartedPulling="2025-12-06 09:07:34.108354501 +0000 UTC m=+7828.921713890" lastFinishedPulling="2025-12-06 09:08:01.95176708 +0000 UTC m=+7856.765126469" observedRunningTime="2025-12-06 09:08:02.676208126 +0000 UTC m=+7857.489567535" watchObservedRunningTime="2025-12-06 09:08:02.688111575 +0000 UTC m=+7857.501470974" Dec 06 09:08:03 crc kubenswrapper[4954]: I1206 09:08:03.431234 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:03 crc kubenswrapper[4954]: I1206 09:08:03.431509 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:03 crc kubenswrapper[4954]: I1206 09:08:03.434559 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:03 crc kubenswrapper[4954]: I1206 09:08:03.658949 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.004520 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.007634 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.016998 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.148882 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.148959 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9v64\" (UniqueName: \"kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.149030 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.250708 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.250930 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.250962 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9v64\" (UniqueName: \"kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.251819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.251939 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.273632 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w9v64\" (UniqueName: \"kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64\") pod \"redhat-marketplace-xdx6g\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.366516 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.854508 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.855086 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" containerName="openstackclient" containerID="cri-o://8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5" gracePeriod=2 Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.875942 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.928665 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 09:08:05 crc kubenswrapper[4954]: E1206 09:08:05.929092 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" containerName="openstackclient" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.929106 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" containerName="openstackclient" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.929292 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" containerName="openstackclient" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.929968 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.933055 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" podUID="a4494926-d786-4b95-b768-7434a47be11b" Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.939199 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:08:05 crc kubenswrapper[4954]: I1206 09:08:05.985454 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:05 crc kubenswrapper[4954]: W1206 09:08:05.994695 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77f9d5e_d888_45b9_87ae_a4011a73ccf1.slice/crio-16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd WatchSource:0}: Error finding container 16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd: Status 404 returned error can't find the container with id 16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.091409 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkkp\" (UniqueName: \"kubernetes.io/projected/a4494926-d786-4b95-b768-7434a47be11b-kube-api-access-2dkkp\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.091680 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.091716 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.091753 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4494926-d786-4b95-b768-7434a47be11b-openstack-config\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.193764 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.194736 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: 
I1206 09:08:06.194898 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4494926-d786-4b95-b768-7434a47be11b-openstack-config\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.194958 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkkp\" (UniqueName: \"kubernetes.io/projected/a4494926-d786-4b95-b768-7434a47be11b-kube-api-access-2dkkp\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.195685 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4494926-d786-4b95-b768-7434a47be11b-openstack-config\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.200012 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.200219 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4494926-d786-4b95-b768-7434a47be11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.218173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkkp\" (UniqueName: \"kubernetes.io/projected/a4494926-d786-4b95-b768-7434a47be11b-kube-api-access-2dkkp\") pod \"openstackclient\" (UID: \"a4494926-d786-4b95-b768-7434a47be11b\") " pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.261780 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.706105 4954 generic.go:334] "Generic (PLEG): container finished" podID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerID="1264ca1e79d59b1fb0e2f2c2bd1a30c290a0b63ef29a5000582f06bbfe7b37b5" exitCode=0 Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.706193 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerDied","Data":"1264ca1e79d59b1fb0e2f2c2bd1a30c290a0b63ef29a5000582f06bbfe7b37b5"} Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.706456 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerStarted","Data":"16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd"} Dec 06 09:08:06 crc kubenswrapper[4954]: I1206 09:08:06.955220 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.309407 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.311330 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="prometheus" containerID="cri-o://fdb185abee0820b7528d4e84a2bf7ee2cbde8f9d1a8c41b7650bda5ce79b7791" gracePeriod=600 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.312445 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="config-reloader" containerID="cri-o://20373a7b70751ccb8cbf5ed64b1c60a0b3f33e1a66cf6b1c9f88b22034679046" gracePeriod=600 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.311934 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="thanos-sidecar" containerID="cri-o://ee3ab5956a46d58e248250f896aec6ad3ad89ff417ade80522493ed9b081f489" gracePeriod=600 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.728916 4954 generic.go:334] "Generic (PLEG): container finished" podID="2a044e5d-2332-40cc-890b-7da9e544be80" containerID="ee3ab5956a46d58e248250f896aec6ad3ad89ff417ade80522493ed9b081f489" exitCode=0 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.729280 4954 generic.go:334] "Generic (PLEG): container finished" podID="2a044e5d-2332-40cc-890b-7da9e544be80" containerID="20373a7b70751ccb8cbf5ed64b1c60a0b3f33e1a66cf6b1c9f88b22034679046" exitCode=0 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.729294 4954 generic.go:334] "Generic (PLEG): container finished" podID="2a044e5d-2332-40cc-890b-7da9e544be80" containerID="fdb185abee0820b7528d4e84a2bf7ee2cbde8f9d1a8c41b7650bda5ce79b7791" exitCode=0 Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.728950 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerDied","Data":"ee3ab5956a46d58e248250f896aec6ad3ad89ff417ade80522493ed9b081f489"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.729393 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerDied","Data":"20373a7b70751ccb8cbf5ed64b1c60a0b3f33e1a66cf6b1c9f88b22034679046"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.729408 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerDied","Data":"fdb185abee0820b7528d4e84a2bf7ee2cbde8f9d1a8c41b7650bda5ce79b7791"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.735783 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerStarted","Data":"24b722927c8d98c16e1d44cecef62b9c79b86adbf6db52c88efc05412802f800"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.737073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4494926-d786-4b95-b768-7434a47be11b","Type":"ContainerStarted","Data":"b97c23f7b2c698e2985fc7e7e2efb226b0e27eaea34b623603d478d8de272ef4"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.737121 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4494926-d786-4b95-b768-7434a47be11b","Type":"ContainerStarted","Data":"2fcf5755ee2804905ccca4ea057c751db31cfd2956ca447772e74d47ebe84b9d"} Dec 06 09:08:07 crc kubenswrapper[4954]: I1206 09:08:07.771358 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.77133688 podStartE2EDuration="2.77133688s" podCreationTimestamp="2025-12-06 09:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:07.768389351 +0000 UTC m=+7862.581748750" watchObservedRunningTime="2025-12-06 09:08:07.77133688 +0000 UTC m=+7862.584696269" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.492352 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.624669 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.685453 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzkp4\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686667 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686692 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686734 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686760 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle\") pod \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686839 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686861 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9ns\" (UniqueName: \"kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns\") pod \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686914 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686941 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config\") pod \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.686964 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out\") pod \"2a044e5d-2332-40cc-890b-7da9e544be80\" (UID: \"2a044e5d-2332-40cc-890b-7da9e544be80\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.687008 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret\") pod \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\" (UID: \"573df1c9-ef2d-4407-858d-1fa0bdc533fd\") " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.688203 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.704133 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns" (OuterVolumeSpecName: "kube-api-access-7f9ns") pod "573df1c9-ef2d-4407-858d-1fa0bdc533fd" (UID: "573df1c9-ef2d-4407-858d-1fa0bdc533fd"). InnerVolumeSpecName "kube-api-access-7f9ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.704230 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.706213 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out" (OuterVolumeSpecName: "config-out") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.708253 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.728423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.730647 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config" (OuterVolumeSpecName: "web-config") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.738326 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config" (OuterVolumeSpecName: "config") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.738401 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4" (OuterVolumeSpecName: "kube-api-access-wzkp4") pod "2a044e5d-2332-40cc-890b-7da9e544be80" (UID: "2a044e5d-2332-40cc-890b-7da9e544be80"). InnerVolumeSpecName "kube-api-access-wzkp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.770773 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "573df1c9-ef2d-4407-858d-1fa0bdc533fd" (UID: "573df1c9-ef2d-4407-858d-1fa0bdc533fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.780064 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.780393 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a044e5d-2332-40cc-890b-7da9e544be80","Type":"ContainerDied","Data":"d762c13aaefdbffe278bcc252c447a296d2d04aa3bb608689ef460a12f8713a2"} Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.780471 4954 scope.go:117] "RemoveContainer" containerID="ee3ab5956a46d58e248250f896aec6ad3ad89ff417ade80522493ed9b081f489" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.786776 4954 generic.go:334] "Generic (PLEG): container finished" podID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" containerID="8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5" exitCode=137 Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.787020 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788647 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzkp4\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-kube-api-access-wzkp4\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788745 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") on node \"crc\" " Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788760 4954 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788771 4954 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a044e5d-2332-40cc-890b-7da9e544be80-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788783 4954 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-web-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788810 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788819 4954 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a044e5d-2332-40cc-890b-7da9e544be80-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788828 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9ns\" (UniqueName: \"kubernetes.io/projected/573df1c9-ef2d-4407-858d-1fa0bdc533fd-kube-api-access-7f9ns\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788836 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a044e5d-2332-40cc-890b-7da9e544be80-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.788844 4954 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a044e5d-2332-40cc-890b-7da9e544be80-config-out\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.800215 4954 generic.go:334] "Generic (PLEG): container finished" podID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerID="24b722927c8d98c16e1d44cecef62b9c79b86adbf6db52c88efc05412802f800" exitCode=0 Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.800308 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerDied","Data":"24b722927c8d98c16e1d44cecef62b9c79b86adbf6db52c88efc05412802f800"} Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.807279 4954 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "573df1c9-ef2d-4407-858d-1fa0bdc533fd" (UID: "573df1c9-ef2d-4407-858d-1fa0bdc533fd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.814167 4954 scope.go:117] "RemoveContainer" containerID="20373a7b70751ccb8cbf5ed64b1c60a0b3f33e1a66cf6b1c9f88b22034679046" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.830979 4954 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.831129 4954 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59") on node "crc" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.842490 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "573df1c9-ef2d-4407-858d-1fa0bdc533fd" (UID: "573df1c9-ef2d-4407-858d-1fa0bdc533fd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.864589 4954 scope.go:117] "RemoveContainer" containerID="fdb185abee0820b7528d4e84a2bf7ee2cbde8f9d1a8c41b7650bda5ce79b7791" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.884717 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.890466 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.890499 4954 reconciler_common.go:293] "Volume detached for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.890509 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/573df1c9-ef2d-4407-858d-1fa0bdc533fd-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.900848 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.906594 4954 scope.go:117] "RemoveContainer" containerID="171aef1870503661b37f65aa48a9fd84082071336e538a3380a13ecca8594ff3" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.936798 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:08 crc kubenswrapper[4954]: E1206 09:08:08.937259 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="config-reloader" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937277 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" 
containerName="config-reloader" Dec 06 09:08:08 crc kubenswrapper[4954]: E1206 09:08:08.937299 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="init-config-reloader" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937305 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="init-config-reloader" Dec 06 09:08:08 crc kubenswrapper[4954]: E1206 09:08:08.937325 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="thanos-sidecar" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937332 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="thanos-sidecar" Dec 06 09:08:08 crc kubenswrapper[4954]: E1206 09:08:08.937345 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="prometheus" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937351 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="prometheus" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937554 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="config-reloader" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937605 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="thanos-sidecar" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.937621 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="prometheus" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.940929 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.943073 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.950312 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.955967 4954 scope.go:117] "RemoveContainer" containerID="8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.956849 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.957638 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.957822 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.957908 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-qgxk7" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.958022 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.960300 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 06 09:08:08 crc kubenswrapper[4954]: I1206 09:08:08.999319 4954 scope.go:117] "RemoveContainer" containerID="8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5" Dec 06 09:08:09 crc kubenswrapper[4954]: E1206 09:08:09.000376 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5\": container with ID starting with 8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5 not found: ID does not exist" containerID="8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.000513 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5"} err="failed to get container status \"8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5\": rpc error: code = NotFound desc = could not find container \"8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5\": container with ID starting with 8e1d20fcf903015205a943e0b27984da4df43d6be5ed3c227405e8412c5eabb5 not found: ID does not exist" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.093462 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.093510 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.093548 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.093587 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82879efe-6f16-4ce2-b4cb-17ddd3349804-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094310 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82879efe-6f16-4ce2-b4cb-17ddd3349804-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094334 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094369 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094390 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094412 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntld\" 
(UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-kube-api-access-hntld\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.094447 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.104969 4954 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" podUID="a4494926-d786-4b95-b768-7434a47be11b" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197595 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82879efe-6f16-4ce2-b4cb-17ddd3349804-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197635 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197691 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197720 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntld\" (UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-kube-api-access-hntld\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197795 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197892 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197919 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197953 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.197991 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82879efe-6f16-4ce2-b4cb-17ddd3349804-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.198927 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82879efe-6f16-4ce2-b4cb-17ddd3349804-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.203958 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.206060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.208661 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" 
(UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.208895 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82879efe-6f16-4ce2-b4cb-17ddd3349804-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.211008 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.211039 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/23b790ef866cf8a9b9301f286c81631c28f0905e1c0059509880a0abce6a0705/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.212866 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.217623 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.221130 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.224213 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82879efe-6f16-4ce2-b4cb-17ddd3349804-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.228268 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntld\" (UniqueName: \"kubernetes.io/projected/82879efe-6f16-4ce2-b4cb-17ddd3349804-kube-api-access-hntld\") pod \"prometheus-metric-storage-0\" (UID: \"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.264196 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a493b48d-6be2-496d-b13d-a9aecbaf7f59\") pod \"prometheus-metric-storage-0\" (UID: 
\"82879efe-6f16-4ce2-b4cb-17ddd3349804\") " pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.289784 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.497936 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" path="/var/lib/kubelet/pods/2a044e5d-2332-40cc-890b-7da9e544be80/volumes" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.498863 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573df1c9-ef2d-4407-858d-1fa0bdc533fd" path="/var/lib/kubelet/pods/573df1c9-ef2d-4407-858d-1fa0bdc533fd/volumes" Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.754021 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.872377 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerStarted","Data":"36c95ce3c369ac0c77bd6cd14f813bbae7a0064622a257d93d3e8598acacea65"} Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.879786 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerStarted","Data":"73d601a4f1157f5f05c6a88c77517225d296aee61488c7d89e23971eab5bb481"} Dec 06 09:08:09 crc kubenswrapper[4954]: I1206 09:08:09.901985 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xdx6g" podStartSLOduration=3.3571141620000002 podStartE2EDuration="5.901965697s" podCreationTimestamp="2025-12-06 09:08:04 +0000 UTC" firstStartedPulling="2025-12-06 09:08:06.710435699 +0000 UTC m=+7861.523795088" lastFinishedPulling="2025-12-06 09:08:09.255287234 +0000 UTC m=+7864.068646623" observedRunningTime="2025-12-06 09:08:09.893937642 +0000 UTC m=+7864.707297031" watchObservedRunningTime="2025-12-06 09:08:09.901965697 +0000 UTC m=+7864.715325086" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.430438 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.434723 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.435692 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="2a044e5d-2332-40cc-890b-7da9e544be80" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.140:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.438672 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.444458 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:08:11 crc kubenswrapper[4954]: E1206 09:08:11.444789 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.447281 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.466348 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.557834 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558001 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558045 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558272 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558329 4954 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lqd\" (UniqueName: \"kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.558350 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659763 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659924 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659945 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659963 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lqd\" (UniqueName: \"kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.659979 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.660012 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.660528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.660653 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.768477 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.768590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.768993 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.769006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:11 crc kubenswrapper[4954]: I1206 09:08:11.770990 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lqd\" (UniqueName: \"kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd\") pod \"ceilometer-0\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " pod="openstack/ceilometer-0" Dec 06 09:08:12 crc kubenswrapper[4954]: I1206 09:08:12.070105 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:12 crc kubenswrapper[4954]: I1206 09:08:12.672261 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:12 crc kubenswrapper[4954]: W1206 09:08:12.778887 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd864a6dd_9dac_41ae_8165_b8c97dbc005d.slice/crio-871b2c96917cdd55aa2a852399409d842f9f0d0c5dff49ce695db39d3b072a38 WatchSource:0}: Error finding container 871b2c96917cdd55aa2a852399409d842f9f0d0c5dff49ce695db39d3b072a38: Status 404 returned error can't find the container with id 871b2c96917cdd55aa2a852399409d842f9f0d0c5dff49ce695db39d3b072a38 Dec 06 09:08:12 crc kubenswrapper[4954]: I1206 09:08:12.924502 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerStarted","Data":"871b2c96917cdd55aa2a852399409d842f9f0d0c5dff49ce695db39d3b072a38"} Dec 06 09:08:13 crc kubenswrapper[4954]: I1206 09:08:13.937427 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerStarted","Data":"ba2c4b20f3b19445413cb425c445a37ca397d514636c18609cfc6325b58a36b6"} Dec 06 09:08:15 crc kubenswrapper[4954]: I1206 09:08:15.367650 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:15 crc kubenswrapper[4954]: I1206 09:08:15.367978 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:15 crc kubenswrapper[4954]: I1206 09:08:15.430709 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:16 crc kubenswrapper[4954]: I1206 09:08:16.021541 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:16 crc kubenswrapper[4954]: I1206 09:08:16.075281 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:16 crc kubenswrapper[4954]: I1206 09:08:16.984516 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerStarted","Data":"d29ec5dd257cc48ea6fa7e8c8fadeb2a86986c56dc2221d53290ae8530d97ada"} Dec 06 09:08:17 crc kubenswrapper[4954]: I1206 09:08:17.992944 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xdx6g" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="registry-server" containerID="cri-o://36c95ce3c369ac0c77bd6cd14f813bbae7a0064622a257d93d3e8598acacea65" gracePeriod=2 Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.003589 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerStarted","Data":"78ad177c1be1c659088fba805101d648bfbedcc16475d13b93d2ecae627a2ef6"} Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.005639 4954 generic.go:334] "Generic (PLEG): container finished" podID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerID="36c95ce3c369ac0c77bd6cd14f813bbae7a0064622a257d93d3e8598acacea65" exitCode=0 Dec 06 09:08:19 crc kubenswrapper[4954]: 
I1206 09:08:19.005681 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerDied","Data":"36c95ce3c369ac0c77bd6cd14f813bbae7a0064622a257d93d3e8598acacea65"} Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.005705 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdx6g" event={"ID":"c77f9d5e-d888-45b9-87ae-a4011a73ccf1","Type":"ContainerDied","Data":"16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd"} Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.005715 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16676e058f9db4608052ea78f2148cebb3bb3abbc5af7f2aa9fc0307238b69bd" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.064979 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.168892 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities\") pod \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.169008 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9v64\" (UniqueName: \"kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64\") pod \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.169077 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content\") pod \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\" (UID: \"c77f9d5e-d888-45b9-87ae-a4011a73ccf1\") " Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.172359 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities" (OuterVolumeSpecName: "utilities") pod "c77f9d5e-d888-45b9-87ae-a4011a73ccf1" (UID: "c77f9d5e-d888-45b9-87ae-a4011a73ccf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.180655 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64" (OuterVolumeSpecName: "kube-api-access-w9v64") pod "c77f9d5e-d888-45b9-87ae-a4011a73ccf1" (UID: "c77f9d5e-d888-45b9-87ae-a4011a73ccf1"). InnerVolumeSpecName "kube-api-access-w9v64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.192351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c77f9d5e-d888-45b9-87ae-a4011a73ccf1" (UID: "c77f9d5e-d888-45b9-87ae-a4011a73ccf1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.272606 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.272648 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:19 crc kubenswrapper[4954]: I1206 09:08:19.272660 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9v64\" (UniqueName: \"kubernetes.io/projected/c77f9d5e-d888-45b9-87ae-a4011a73ccf1-kube-api-access-w9v64\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.028321 4954 generic.go:334] "Generic (PLEG): container finished" podID="82879efe-6f16-4ce2-b4cb-17ddd3349804" containerID="ba2c4b20f3b19445413cb425c445a37ca397d514636c18609cfc6325b58a36b6" exitCode=0 Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.028376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerDied","Data":"ba2c4b20f3b19445413cb425c445a37ca397d514636c18609cfc6325b58a36b6"} Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.032842 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdx6g" Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.033610 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerStarted","Data":"522bd033ff9652de91451f74569eba98c18ab4f8992e15dde2add0fad5c2465d"} Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.185202 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:20 crc kubenswrapper[4954]: I1206 09:08:20.194885 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdx6g"] Dec 06 09:08:21 crc kubenswrapper[4954]: I1206 09:08:21.044436 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerStarted","Data":"52ee6932c4acc9ef5af9380f048a7152da2ac86c8114fe9db245bbc52aca88a0"} Dec 06 09:08:21 crc kubenswrapper[4954]: I1206 09:08:21.048079 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerStarted","Data":"b58a260b2a13db4c5e8aa5df6e094d90f93cef39536b5750c93c06d2237b0554"} Dec 06 09:08:21 crc kubenswrapper[4954]: I1206 09:08:21.049616 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:08:21 crc kubenswrapper[4954]: I1206 09:08:21.074820 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.270985472 podStartE2EDuration="10.074801948s" podCreationTimestamp="2025-12-06 09:08:11 +0000 UTC" firstStartedPulling="2025-12-06 09:08:12.781108448 +0000 UTC m=+7867.594467837" lastFinishedPulling="2025-12-06 09:08:20.584924924 +0000 UTC m=+7875.398284313" observedRunningTime="2025-12-06 09:08:21.070727649 +0000 UTC 
m=+7875.884087048" watchObservedRunningTime="2025-12-06 09:08:21.074801948 +0000 UTC m=+7875.888161337" Dec 06 09:08:21 crc kubenswrapper[4954]: I1206 09:08:21.472801 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" path="/var/lib/kubelet/pods/c77f9d5e-d888-45b9-87ae-a4011a73ccf1/volumes" Dec 06 09:08:22 crc kubenswrapper[4954]: I1206 09:08:22.443482 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:08:22 crc kubenswrapper[4954]: E1206 09:08:22.444264 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:08:25 crc kubenswrapper[4954]: I1206 09:08:25.085021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerStarted","Data":"2f06c19e44fd8758df0af53f39e8770e193a2c643b98ecb0e560bb52009351ba"} Dec 06 09:08:25 crc kubenswrapper[4954]: I1206 09:08:25.085628 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"82879efe-6f16-4ce2-b4cb-17ddd3349804","Type":"ContainerStarted","Data":"b2e866062e618275910d41907950bfe3a55da2ac848d617d9eb769b878100bf8"} Dec 06 09:08:25 crc kubenswrapper[4954]: I1206 09:08:25.117477 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.117458987 podStartE2EDuration="17.117458987s" podCreationTimestamp="2025-12-06 09:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:25.106966386 +0000 UTC m=+7879.920325775" watchObservedRunningTime="2025-12-06 09:08:25.117458987 +0000 UTC m=+7879.930818376" Dec 06 09:08:25 crc kubenswrapper[4954]: I1206 09:08:25.304693 4954 scope.go:117] "RemoveContainer" containerID="d85be8ac5449c87f4e32dcd903dd026b7361d3d619a7b137a1ce711706cfb8c0" Dec 06 09:08:25 crc kubenswrapper[4954]: I1206 09:08:25.330380 4954 scope.go:117] "RemoveContainer" containerID="c8e7b319ad5c939409ebead664b35c2411b8da565fd7e7e30c6a787d5b100c45" Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.883527 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-c9hvq"] Dec 06 09:08:27 crc kubenswrapper[4954]: E1206 09:08:27.884481 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="extract-utilities" Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.884495 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="extract-utilities" Dec 06 09:08:27 crc kubenswrapper[4954]: E1206 09:08:27.884511 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="extract-content" Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.884517 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="extract-content" Dec 06 09:08:27 crc 
Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.884533 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="registry-server"
Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.884982 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77f9d5e-d888-45b9-87ae-a4011a73ccf1" containerName="registry-server"
Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.886542 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:27 crc kubenswrapper[4954]: I1206 09:08:27.911960 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-c9hvq"]
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.046251 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bpcjv"]
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.053544 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.053790 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b8p4\" (UniqueName: \"kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.056008 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bpcjv"]
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.096452 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0197-account-create-update-rmv95"]
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.097956 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0197-account-create-update-rmv95"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.102961 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.112555 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0197-account-create-update-rmv95"]
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.159521 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b8p4\" (UniqueName: \"kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.159687 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.162358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.177894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b8p4\" (UniqueName: \"kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4\") pod \"aodh-db-create-c9hvq\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " pod="openstack/aodh-db-create-c9hvq"
Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.216196 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c9hvq"
Need to start a new one" pod="openstack/aodh-db-create-c9hvq" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.272321 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.272623 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2gf\" (UniqueName: \"kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.375084 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2gf\" (UniqueName: \"kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.375925 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.376715 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.406250 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2gf\" (UniqueName: \"kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf\") pod \"aodh-0197-account-create-update-rmv95\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.420018 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:28 crc kubenswrapper[4954]: I1206 09:08:28.860304 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-c9hvq"] Dec 06 09:08:29 crc kubenswrapper[4954]: W1206 09:08:29.076349 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb202835_f662_4f75_a820_76ffc6def068.slice/crio-00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290 WatchSource:0}: Error finding container 00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290: Status 404 returned error can't find the container with id 00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290 Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.086085 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0197-account-create-update-rmv95"] Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.139926 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0197-account-create-update-rmv95" event={"ID":"eb202835-f662-4f75-a820-76ffc6def068","Type":"ContainerStarted","Data":"00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290"} Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.144390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c9hvq" event={"ID":"91e8c0ec-a401-4e33-b035-cc6cabed461a","Type":"ContainerStarted","Data":"abcdeb08fa7b85002381dca07d746f3d79ea7ffff01e277649f6a2c8ca45c1b9"} Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.144496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c9hvq" event={"ID":"91e8c0ec-a401-4e33-b035-cc6cabed461a","Type":"ContainerStarted","Data":"af1fa93827b047b69ab2a110e3882a4d068b891aca84da33b8b9bbc6b2512d24"} Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.179163 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-c9hvq" podStartSLOduration=2.179134874 podStartE2EDuration="2.179134874s" podCreationTimestamp="2025-12-06 09:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:08:29.165270803 +0000 UTC m=+7883.978630202" watchObservedRunningTime="2025-12-06 09:08:29.179134874 +0000 UTC m=+7883.992494263" Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.290887 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:29 crc kubenswrapper[4954]: I1206 09:08:29.454799 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5" path="/var/lib/kubelet/pods/d90c1a08-b5fa-49a4-b8b3-f26e610d5cc5/volumes" Dec 06 09:08:30 crc kubenswrapper[4954]: I1206 09:08:30.153983 4954 generic.go:334] "Generic (PLEG): container finished" podID="91e8c0ec-a401-4e33-b035-cc6cabed461a" containerID="abcdeb08fa7b85002381dca07d746f3d79ea7ffff01e277649f6a2c8ca45c1b9" exitCode=0 Dec 06 09:08:30 crc kubenswrapper[4954]: I1206 09:08:30.154064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c9hvq" event={"ID":"91e8c0ec-a401-4e33-b035-cc6cabed461a","Type":"ContainerDied","Data":"abcdeb08fa7b85002381dca07d746f3d79ea7ffff01e277649f6a2c8ca45c1b9"} Dec 06 09:08:30 crc kubenswrapper[4954]: I1206 09:08:30.155985 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="eb202835-f662-4f75-a820-76ffc6def068" containerID="462cad44fd266e5407bf823a7ada4fd1db3ebdd3fcb5bbacc34ef526a65ccbde" exitCode=0 Dec 06 09:08:30 crc kubenswrapper[4954]: I1206 09:08:30.156037 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0197-account-create-update-rmv95" event={"ID":"eb202835-f662-4f75-a820-76ffc6def068","Type":"ContainerDied","Data":"462cad44fd266e5407bf823a7ada4fd1db3ebdd3fcb5bbacc34ef526a65ccbde"} Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.605754 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c9hvq" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.613148 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.753197 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts\") pod \"eb202835-f662-4f75-a820-76ffc6def068\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.753316 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts\") pod \"91e8c0ec-a401-4e33-b035-cc6cabed461a\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.753451 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl2gf\" (UniqueName: \"kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf\") pod \"eb202835-f662-4f75-a820-76ffc6def068\" (UID: \"eb202835-f662-4f75-a820-76ffc6def068\") " Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.753510 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b8p4\" (UniqueName: \"kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4\") pod \"91e8c0ec-a401-4e33-b035-cc6cabed461a\" (UID: \"91e8c0ec-a401-4e33-b035-cc6cabed461a\") " Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.754501 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb202835-f662-4f75-a820-76ffc6def068" (UID: "eb202835-f662-4f75-a820-76ffc6def068"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.754524 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91e8c0ec-a401-4e33-b035-cc6cabed461a" (UID: "91e8c0ec-a401-4e33-b035-cc6cabed461a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.759357 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4" (OuterVolumeSpecName: "kube-api-access-2b8p4") pod "91e8c0ec-a401-4e33-b035-cc6cabed461a" (UID: "91e8c0ec-a401-4e33-b035-cc6cabed461a"). InnerVolumeSpecName "kube-api-access-2b8p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.761109 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf" (OuterVolumeSpecName: "kube-api-access-cl2gf") pod "eb202835-f662-4f75-a820-76ffc6def068" (UID: "eb202835-f662-4f75-a820-76ffc6def068"). InnerVolumeSpecName "kube-api-access-cl2gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.855441 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl2gf\" (UniqueName: \"kubernetes.io/projected/eb202835-f662-4f75-a820-76ffc6def068-kube-api-access-cl2gf\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.855482 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b8p4\" (UniqueName: \"kubernetes.io/projected/91e8c0ec-a401-4e33-b035-cc6cabed461a-kube-api-access-2b8p4\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.855500 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb202835-f662-4f75-a820-76ffc6def068-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:31 crc kubenswrapper[4954]: I1206 09:08:31.855515 4954 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e8c0ec-a401-4e33-b035-cc6cabed461a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.178337 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c9hvq" event={"ID":"91e8c0ec-a401-4e33-b035-cc6cabed461a","Type":"ContainerDied","Data":"af1fa93827b047b69ab2a110e3882a4d068b891aca84da33b8b9bbc6b2512d24"} Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.178378 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af1fa93827b047b69ab2a110e3882a4d068b891aca84da33b8b9bbc6b2512d24" Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.178390 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c9hvq" Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.182359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0197-account-create-update-rmv95" event={"ID":"eb202835-f662-4f75-a820-76ffc6def068","Type":"ContainerDied","Data":"00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290"} Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.182409 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f70f112f3af4c62cc243030c4870d5e90c3b319d69f676f86fa59b46197290" Dec 06 09:08:32 crc kubenswrapper[4954]: I1206 09:08:32.182515 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0197-account-create-update-rmv95" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.443493 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:08:33 crc kubenswrapper[4954]: E1206 09:08:33.443920 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.470078 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-zgc76"] Dec 06 09:08:33 crc kubenswrapper[4954]: E1206 09:08:33.470594 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb202835-f662-4f75-a820-76ffc6def068" containerName="mariadb-account-create-update" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.470618 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb202835-f662-4f75-a820-76ffc6def068" containerName="mariadb-account-create-update" Dec 06 09:08:33 crc kubenswrapper[4954]: E1206 09:08:33.470645 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e8c0ec-a401-4e33-b035-cc6cabed461a" containerName="mariadb-database-create" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.470654 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e8c0ec-a401-4e33-b035-cc6cabed461a" containerName="mariadb-database-create" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.470996 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e8c0ec-a401-4e33-b035-cc6cabed461a" containerName="mariadb-database-create" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.471019 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb202835-f662-4f75-a820-76ffc6def068" containerName="mariadb-account-create-update" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.471938 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.474517 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mx8tx" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.474836 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.474916 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.475025 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.481671 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zgc76"] Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.537194 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.537287 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpld\" (UniqueName: \"kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.596078 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.596460 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.698855 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.698904 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpld\" (UniqueName: \"kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.698969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: 
I1206 09:08:33.699005 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.705424 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.710467 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.716180 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.721194 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpld\" (UniqueName: \"kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld\") pod \"aodh-db-sync-zgc76\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:33 crc kubenswrapper[4954]: I1206 09:08:33.791282 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:34 crc kubenswrapper[4954]: I1206 09:08:34.347913 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zgc76"] Dec 06 09:08:34 crc kubenswrapper[4954]: W1206 09:08:34.353130 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73f11fa_ce19_4f8e_9029_6bee3482fc8f.slice/crio-e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411 WatchSource:0}: Error finding container e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411: Status 404 returned error can't find the container with id e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411 Dec 06 09:08:35 crc kubenswrapper[4954]: I1206 09:08:35.245729 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zgc76" event={"ID":"c73f11fa-ce19-4f8e-9029-6bee3482fc8f","Type":"ContainerStarted","Data":"e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411"} Dec 06 09:08:39 crc kubenswrapper[4954]: I1206 09:08:39.292881 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:39 crc kubenswrapper[4954]: I1206 09:08:39.305771 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:39 crc kubenswrapper[4954]: I1206 09:08:39.321442 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 06 09:08:40 crc kubenswrapper[4954]: I1206 09:08:40.327858 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zgc76" event={"ID":"c73f11fa-ce19-4f8e-9029-6bee3482fc8f","Type":"ContainerStarted","Data":"455db10736ecb45602e4af3895f5ab9cf9a22300fc9441546721dcfeebe2e38d"} Dec 06 09:08:40 crc kubenswrapper[4954]: I1206 09:08:40.343605 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-zgc76" podStartSLOduration=2.438516478 podStartE2EDuration="7.343558029s" podCreationTimestamp="2025-12-06 09:08:33 +0000 UTC" firstStartedPulling="2025-12-06 09:08:34.356315077 +0000 UTC m=+7889.169674466" lastFinishedPulling="2025-12-06 09:08:39.261356618 +0000 UTC m=+7894.074716017" observedRunningTime="2025-12-06 09:08:40.342863621 +0000 UTC m=+7895.156223040" watchObservedRunningTime="2025-12-06 09:08:40.343558029 +0000 UTC m=+7895.156917418" Dec 06 09:08:42 crc kubenswrapper[4954]: I1206 09:08:42.078601 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 09:08:42 crc kubenswrapper[4954]: I1206 09:08:42.351625 4954 generic.go:334] "Generic (PLEG): container finished" podID="c73f11fa-ce19-4f8e-9029-6bee3482fc8f" containerID="455db10736ecb45602e4af3895f5ab9cf9a22300fc9441546721dcfeebe2e38d" exitCode=0 Dec 06 09:08:42 crc kubenswrapper[4954]: I1206 09:08:42.351676 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zgc76" event={"ID":"c73f11fa-ce19-4f8e-9029-6bee3482fc8f","Type":"ContainerDied","Data":"455db10736ecb45602e4af3895f5ab9cf9a22300fc9441546721dcfeebe2e38d"} Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.875387 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.956067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle\") pod \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.956131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data\") pod \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.956159 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlpld\" (UniqueName: \"kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld\") pod \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.956310 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts\") pod \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\" (UID: \"c73f11fa-ce19-4f8e-9029-6bee3482fc8f\") " Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.961912 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld" (OuterVolumeSpecName: "kube-api-access-xlpld") pod "c73f11fa-ce19-4f8e-9029-6bee3482fc8f" (UID: "c73f11fa-ce19-4f8e-9029-6bee3482fc8f"). InnerVolumeSpecName "kube-api-access-xlpld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:43 crc kubenswrapper[4954]: I1206 09:08:43.962324 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts" (OuterVolumeSpecName: "scripts") pod "c73f11fa-ce19-4f8e-9029-6bee3482fc8f" (UID: "c73f11fa-ce19-4f8e-9029-6bee3482fc8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.005344 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data" (OuterVolumeSpecName: "config-data") pod "c73f11fa-ce19-4f8e-9029-6bee3482fc8f" (UID: "c73f11fa-ce19-4f8e-9029-6bee3482fc8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.012964 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c73f11fa-ce19-4f8e-9029-6bee3482fc8f" (UID: "c73f11fa-ce19-4f8e-9029-6bee3482fc8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.059230 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.059267 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.059280 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlpld\" (UniqueName: \"kubernetes.io/projected/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-kube-api-access-xlpld\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.059293 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73f11fa-ce19-4f8e-9029-6bee3482fc8f-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.372894 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zgc76" event={"ID":"c73f11fa-ce19-4f8e-9029-6bee3482fc8f","Type":"ContainerDied","Data":"e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411"} Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.372931 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5f8e188698d6b880cdb1251431a4dfd9e2f722b66fc10389d241d9241518411" Dec 06 09:08:44 crc kubenswrapper[4954]: I1206 09:08:44.372984 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zgc76" Dec 06 09:08:45 crc kubenswrapper[4954]: I1206 09:08:45.828760 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:45 crc kubenswrapper[4954]: I1206 09:08:45.829249 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e01f09b1-4b58-4eb8-90cf-d7738a282534" containerName="kube-state-metrics" containerID="cri-o://6a104489772e875edc8d11eb4688bb36944a923915ef0c8e9ee52d11952916e9" gracePeriod=30 Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.435050 4954 generic.go:334] "Generic (PLEG): container finished" podID="e01f09b1-4b58-4eb8-90cf-d7738a282534" containerID="6a104489772e875edc8d11eb4688bb36944a923915ef0c8e9ee52d11952916e9" exitCode=2 Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.435511 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e01f09b1-4b58-4eb8-90cf-d7738a282534","Type":"ContainerDied","Data":"6a104489772e875edc8d11eb4688bb36944a923915ef0c8e9ee52d11952916e9"} Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.435603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e01f09b1-4b58-4eb8-90cf-d7738a282534","Type":"ContainerDied","Data":"7326f6faf9b2ed8d9f0a01b3ec234792eb3f24e5b6306dc5b5372566c9c5dadc"} Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.435673 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7326f6faf9b2ed8d9f0a01b3ec234792eb3f24e5b6306dc5b5372566c9c5dadc" Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.445832 4954 scope.go:117] "RemoveContainer" 
containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:08:46 crc kubenswrapper[4954]: E1206 09:08:46.446218 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.477825 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.619989 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkcf2\" (UniqueName: \"kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2\") pod \"e01f09b1-4b58-4eb8-90cf-d7738a282534\" (UID: \"e01f09b1-4b58-4eb8-90cf-d7738a282534\") " Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.713195 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2" (OuterVolumeSpecName: "kube-api-access-mkcf2") pod "e01f09b1-4b58-4eb8-90cf-d7738a282534" (UID: "e01f09b1-4b58-4eb8-90cf-d7738a282534"). InnerVolumeSpecName "kube-api-access-mkcf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:46 crc kubenswrapper[4954]: I1206 09:08:46.723965 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkcf2\" (UniqueName: \"kubernetes.io/projected/e01f09b1-4b58-4eb8-90cf-d7738a282534-kube-api-access-mkcf2\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.444495 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.482628 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.500951 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.503282 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:47 crc kubenswrapper[4954]: E1206 09:08:47.503726 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01f09b1-4b58-4eb8-90cf-d7738a282534" containerName="kube-state-metrics" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.503746 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01f09b1-4b58-4eb8-90cf-d7738a282534" containerName="kube-state-metrics" Dec 06 09:08:47 crc kubenswrapper[4954]: E1206 09:08:47.503775 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73f11fa-ce19-4f8e-9029-6bee3482fc8f" containerName="aodh-db-sync" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.503781 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73f11fa-ce19-4f8e-9029-6bee3482fc8f" containerName="aodh-db-sync" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.503992 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73f11fa-ce19-4f8e-9029-6bee3482fc8f" containerName="aodh-db-sync" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.504020 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01f09b1-4b58-4eb8-90cf-d7738a282534" containerName="kube-state-metrics" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.504741 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.508338 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.508680 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.515127 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.541319 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.541440 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.541487 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.541713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbc6\" (UniqueName: \"kubernetes.io/projected/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-api-access-2pbc6\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.643144 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.643189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.644480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pbc6\" (UniqueName: \"kubernetes.io/projected/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-api-access-2pbc6\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.644711 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.647527 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.651820 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.657371 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51b2504-4a33-4058-be27-67afbfe5efc4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.665372 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbc6\" (UniqueName: \"kubernetes.io/projected/a51b2504-4a33-4058-be27-67afbfe5efc4-kube-api-access-2pbc6\") pod \"kube-state-metrics-0\" (UID: \"a51b2504-4a33-4058-be27-67afbfe5efc4\") " pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.819281 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.943431 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.947025 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.952535 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.952826 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.953034 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mx8tx" Dec 06 09:08:47 crc kubenswrapper[4954]: I1206 09:08:47.992944 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.060113 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.060435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.060505 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.060604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6pbd\" (UniqueName: \"kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.163253 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.163304 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.163331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6pbd\" (UniqueName: \"kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.163426 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: 
I1206 09:08:48.172552 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.173590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.181932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6pbd\" (UniqueName: \"kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.181989 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts\") pod \"aodh-0\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.299082 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.386881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.469103 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a51b2504-4a33-4058-be27-67afbfe5efc4","Type":"ContainerStarted","Data":"fc0923598ae0acc530e7220902a729d57e93694a3bbe937089a7787ea9c48f1a"} Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.471848 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.472151 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-central-agent" containerID="cri-o://d29ec5dd257cc48ea6fa7e8c8fadeb2a86986c56dc2221d53290ae8530d97ada" gracePeriod=30 Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.472156 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="proxy-httpd" containerID="cri-o://b58a260b2a13db4c5e8aa5df6e094d90f93cef39536b5750c93c06d2237b0554" gracePeriod=30 Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.472239 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-notification-agent" containerID="cri-o://78ad177c1be1c659088fba805101d648bfbedcc16475d13b93d2ecae627a2ef6" gracePeriod=30 Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.472278 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="sg-core" containerID="cri-o://522bd033ff9652de91451f74569eba98c18ab4f8992e15dde2add0fad5c2465d" gracePeriod=30 Dec 06 09:08:48 crc kubenswrapper[4954]: I1206 09:08:48.850674 4954 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 06 09:08:48 crc kubenswrapper[4954]: W1206 09:08:48.863467 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda85a67ba_1b95_44bc_8792_16ede1bc5f33.slice/crio-448f1e7603bdc33fbf677b8ad9e5df1209a7c62197a50f7cac7dbd941dcd43d9 WatchSource:0}: Error finding container 448f1e7603bdc33fbf677b8ad9e5df1209a7c62197a50f7cac7dbd941dcd43d9: Status 404 returned error can't find the container with id 448f1e7603bdc33fbf677b8ad9e5df1209a7c62197a50f7cac7dbd941dcd43d9 Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.456179 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01f09b1-4b58-4eb8-90cf-d7738a282534" path="/var/lib/kubelet/pods/e01f09b1-4b58-4eb8-90cf-d7738a282534/volumes" Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.481489 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerStarted","Data":"825f1570789deef4815b68f1d383d0bfd385ca52afb8f912c24b5bb23d82ac9d"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.481973 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerStarted","Data":"448f1e7603bdc33fbf677b8ad9e5df1209a7c62197a50f7cac7dbd941dcd43d9"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.483330 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a51b2504-4a33-4058-be27-67afbfe5efc4","Type":"ContainerStarted","Data":"eb06189891685f76bbaf2a6bfad2a1c53d6cc0745aaf5da7cdafb74ba01d4e5f"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.483475 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488656 4954 generic.go:334] "Generic (PLEG): container finished" podID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerID="b58a260b2a13db4c5e8aa5df6e094d90f93cef39536b5750c93c06d2237b0554" exitCode=0 Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488717 4954 generic.go:334] "Generic (PLEG): container finished" podID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerID="522bd033ff9652de91451f74569eba98c18ab4f8992e15dde2add0fad5c2465d" exitCode=2 Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488729 4954 generic.go:334] "Generic (PLEG): container finished" podID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerID="d29ec5dd257cc48ea6fa7e8c8fadeb2a86986c56dc2221d53290ae8530d97ada" exitCode=0 Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488751 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerDied","Data":"b58a260b2a13db4c5e8aa5df6e094d90f93cef39536b5750c93c06d2237b0554"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerDied","Data":"522bd033ff9652de91451f74569eba98c18ab4f8992e15dde2add0fad5c2465d"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.488829 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerDied","Data":"d29ec5dd257cc48ea6fa7e8c8fadeb2a86986c56dc2221d53290ae8530d97ada"} Dec 06 09:08:49 crc kubenswrapper[4954]: I1206 09:08:49.516514 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.107183518 podStartE2EDuration="2.516489973s" podCreationTimestamp="2025-12-06 09:08:47 +0000 UTC" firstStartedPulling="2025-12-06 09:08:48.389432551 +0000 UTC m=+7903.202791940" lastFinishedPulling="2025-12-06 09:08:48.798738996 +0000 UTC m=+7903.612098395" observedRunningTime="2025-12-06 09:08:49.512849056 +0000 UTC m=+7904.326208445" watchObservedRunningTime="2025-12-06 09:08:49.516489973 +0000 UTC m=+7904.329849362" Dec 06 09:08:50 crc kubenswrapper[4954]: I1206 09:08:50.509782 4954 generic.go:334] "Generic (PLEG): container finished" podID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerID="78ad177c1be1c659088fba805101d648bfbedcc16475d13b93d2ecae627a2ef6" exitCode=0 Dec 06 09:08:50 crc kubenswrapper[4954]: I1206 09:08:50.510694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerDied","Data":"78ad177c1be1c659088fba805101d648bfbedcc16475d13b93d2ecae627a2ef6"} Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.041998 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6lqd\" (UniqueName: \"kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157242 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157270 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157434 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157461 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd\") pod 
\"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.157491 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts\") pod \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\" (UID: \"d864a6dd-9dac-41ae-8165-b8c97dbc005d\") " Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.158161 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.158394 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.162985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd" (OuterVolumeSpecName: "kube-api-access-j6lqd") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "kube-api-access-j6lqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.163046 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts" (OuterVolumeSpecName: "scripts") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.192746 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.260680 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.260709 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.260717 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d864a6dd-9dac-41ae-8165-b8c97dbc005d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.260726 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6lqd\" (UniqueName: \"kubernetes.io/projected/d864a6dd-9dac-41ae-8165-b8c97dbc005d-kube-api-access-j6lqd\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.260735 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.265782 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.307361 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data" (OuterVolumeSpecName: "config-data") pod "d864a6dd-9dac-41ae-8165-b8c97dbc005d" (UID: "d864a6dd-9dac-41ae-8165-b8c97dbc005d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.362416 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.362456 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d864a6dd-9dac-41ae-8165-b8c97dbc005d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.524475 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d864a6dd-9dac-41ae-8165-b8c97dbc005d","Type":"ContainerDied","Data":"871b2c96917cdd55aa2a852399409d842f9f0d0c5dff49ce695db39d3b072a38"} Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.524510 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.524547 4954 scope.go:117] "RemoveContainer" containerID="b58a260b2a13db4c5e8aa5df6e094d90f93cef39536b5750c93c06d2237b0554" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.529942 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerStarted","Data":"d84b351d670b82d045a6674180f725c54c4db211fc90208fcf8589701cffd48f"} Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.556764 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.562664 4954 scope.go:117] "RemoveContainer" containerID="522bd033ff9652de91451f74569eba98c18ab4f8992e15dde2add0fad5c2465d" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.567594 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.583328 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:51 crc kubenswrapper[4954]: E1206 09:08:51.584035 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="sg-core" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584054 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="sg-core" Dec 06 09:08:51 crc kubenswrapper[4954]: E1206 09:08:51.584080 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-notification-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584087 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-notification-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: E1206 09:08:51.584106 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-central-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584113 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-central-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: E1206 09:08:51.584122 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="proxy-httpd" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584129 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="proxy-httpd" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584320 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="ceilometer-notification-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584342 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="proxy-httpd" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584352 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" containerName="sg-core" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.584366 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" 
containerName="ceilometer-central-agent" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.586120 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.594389 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.594595 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.594723 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.600206 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.604162 4954 scope.go:117] "RemoveContainer" containerID="78ad177c1be1c659088fba805101d648bfbedcc16475d13b93d2ecae627a2ef6" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.668177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.668968 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkw86\" (UniqueName: \"kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669114 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669352 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669415 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669454 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669690 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.669782 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.679311 4954 scope.go:117] "RemoveContainer" containerID="d29ec5dd257cc48ea6fa7e8c8fadeb2a86986c56dc2221d53290ae8530d97ada" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.774939 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775009 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775049 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775107 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775138 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775161 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775259 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkw86\" (UniqueName: \"kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.775328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " 
pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.776509 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.777131 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.781395 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.784274 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.784547 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.785281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.802872 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.806140 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkw86\" (UniqueName: \"kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86\") pod \"ceilometer-0\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " pod="openstack/ceilometer-0" Dec 06 09:08:51 crc kubenswrapper[4954]: I1206 09:08:51.949890 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 06 09:08:52 crc kubenswrapper[4954]: I1206 09:08:52.026620 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:08:52 crc kubenswrapper[4954]: I1206 09:08:52.922872 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:52 crc kubenswrapper[4954]: W1206 09:08:52.941817 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e93c4d_2ef7_4fd1_873b_9175a1a57997.slice/crio-e215347e1a2fc06355ef187e42878568609393f58ef139a9420da403ca9b99a1 WatchSource:0}: Error finding container e215347e1a2fc06355ef187e42878568609393f58ef139a9420da403ca9b99a1: Status 404 returned error can't find the container with id e215347e1a2fc06355ef187e42878568609393f58ef139a9420da403ca9b99a1 Dec 06 09:08:53 crc kubenswrapper[4954]: I1206 09:08:53.544372 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d864a6dd-9dac-41ae-8165-b8c97dbc005d" path="/var/lib/kubelet/pods/d864a6dd-9dac-41ae-8165-b8c97dbc005d/volumes" Dec 06 09:08:53 crc kubenswrapper[4954]: I1206 09:08:53.602210 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerStarted","Data":"e215347e1a2fc06355ef187e42878568609393f58ef139a9420da403ca9b99a1"} Dec 06 09:08:53 crc kubenswrapper[4954]: I1206 09:08:53.609254 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerStarted","Data":"9f1be2c436f95989236c024dde7066d64679084c6122c5b5af3ebfbbd7d4a831"} Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.179688 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.619144 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerStarted","Data":"9402e740f0e773a6304964122021a911e3ae1ea004423f9e484d757674dc2c3f"} Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.619189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerStarted","Data":"f7a730b37dad0c6361c00e9b0492e8625c73fe67396796d17b9666df024ef1b3"} Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.623401 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerStarted","Data":"ed20b23815644375f353b27f182e8d1440f89b0bb7dfec830c60a57ea6a4932d"} Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.623551 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-api" containerID="cri-o://825f1570789deef4815b68f1d383d0bfd385ca52afb8f912c24b5bb23d82ac9d" gracePeriod=30 Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.623626 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-listener" containerID="cri-o://ed20b23815644375f353b27f182e8d1440f89b0bb7dfec830c60a57ea6a4932d" gracePeriod=30 Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.623666 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-notifier" 
containerID="cri-o://9f1be2c436f95989236c024dde7066d64679084c6122c5b5af3ebfbbd7d4a831" gracePeriod=30 Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.623709 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-evaluator" containerID="cri-o://d84b351d670b82d045a6674180f725c54c4db211fc90208fcf8589701cffd48f" gracePeriod=30 Dec 06 09:08:54 crc kubenswrapper[4954]: I1206 09:08:54.658412 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.468831608 podStartE2EDuration="7.658388881s" podCreationTimestamp="2025-12-06 09:08:47 +0000 UTC" firstStartedPulling="2025-12-06 09:08:48.867220421 +0000 UTC m=+7903.680579800" lastFinishedPulling="2025-12-06 09:08:54.056777674 +0000 UTC m=+7908.870137073" observedRunningTime="2025-12-06 09:08:54.652281067 +0000 UTC m=+7909.465640456" watchObservedRunningTime="2025-12-06 09:08:54.658388881 +0000 UTC m=+7909.471748270" Dec 06 09:08:55 crc kubenswrapper[4954]: I1206 09:08:55.672397 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerStarted","Data":"784e9c867597865d57f814154c4079ba5d911c2579cedac68758144d21b8346f"} Dec 06 09:08:55 crc kubenswrapper[4954]: I1206 09:08:55.684909 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerID="d84b351d670b82d045a6674180f725c54c4db211fc90208fcf8589701cffd48f" exitCode=0 Dec 06 09:08:55 crc kubenswrapper[4954]: I1206 09:08:55.684939 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerID="825f1570789deef4815b68f1d383d0bfd385ca52afb8f912c24b5bb23d82ac9d" exitCode=0 Dec 06 09:08:55 crc kubenswrapper[4954]: I1206 09:08:55.684954 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerDied","Data":"d84b351d670b82d045a6674180f725c54c4db211fc90208fcf8589701cffd48f"} Dec 06 09:08:55 crc kubenswrapper[4954]: I1206 09:08:55.684977 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerDied","Data":"825f1570789deef4815b68f1d383d0bfd385ca52afb8f912c24b5bb23d82ac9d"} Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.695574 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerStarted","Data":"7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6"} Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.695732 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-central-agent" containerID="cri-o://f7a730b37dad0c6361c00e9b0492e8625c73fe67396796d17b9666df024ef1b3" gracePeriod=30 Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.695797 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="proxy-httpd" containerID="cri-o://7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6" gracePeriod=30 Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.695879 4954 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-notification-agent" containerID="cri-o://9402e740f0e773a6304964122021a911e3ae1ea004423f9e484d757674dc2c3f" gracePeriod=30 Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.695836 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="sg-core" containerID="cri-o://784e9c867597865d57f814154c4079ba5d911c2579cedac68758144d21b8346f" gracePeriod=30 Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.696010 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:08:56 crc kubenswrapper[4954]: I1206 09:08:56.730341 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.558460953 podStartE2EDuration="5.730314145s" podCreationTimestamp="2025-12-06 09:08:51 +0000 UTC" firstStartedPulling="2025-12-06 09:08:52.944913798 +0000 UTC m=+7907.758273187" lastFinishedPulling="2025-12-06 09:08:56.11676699 +0000 UTC m=+7910.930126379" observedRunningTime="2025-12-06 09:08:56.715912139 +0000 UTC m=+7911.529271548" watchObservedRunningTime="2025-12-06 09:08:56.730314145 +0000 UTC m=+7911.543673534" Dec 06 09:08:57 crc kubenswrapper[4954]: E1206 09:08:57.000979 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e93c4d_2ef7_4fd1_873b_9175a1a57997.slice/crio-conmon-7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e93c4d_2ef7_4fd1_873b_9175a1a57997.slice/crio-7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782111 4954 generic.go:334] "Generic (PLEG): container finished" podID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerID="7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6" exitCode=0 Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782774 4954 generic.go:334] "Generic (PLEG): container finished" podID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerID="784e9c867597865d57f814154c4079ba5d911c2579cedac68758144d21b8346f" exitCode=2 Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782794 4954 generic.go:334] "Generic (PLEG): container finished" podID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerID="9402e740f0e773a6304964122021a911e3ae1ea004423f9e484d757674dc2c3f" exitCode=0 Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerDied","Data":"7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6"} Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerDied","Data":"784e9c867597865d57f814154c4079ba5d911c2579cedac68758144d21b8346f"} Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.782922 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerDied","Data":"9402e740f0e773a6304964122021a911e3ae1ea004423f9e484d757674dc2c3f"} Dec 06 09:08:57 crc kubenswrapper[4954]: I1206 09:08:57.848886 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 09:09:00 crc kubenswrapper[4954]: I1206 09:09:00.443837 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:09:00 crc kubenswrapper[4954]: E1206 09:09:00.445574 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:09:00 crc kubenswrapper[4954]: I1206 09:09:00.827091 4954 generic.go:334] "Generic (PLEG): container finished" podID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerID="f7a730b37dad0c6361c00e9b0492e8625c73fe67396796d17b9666df024ef1b3" exitCode=0 Dec 06 09:09:00 crc kubenswrapper[4954]: I1206 09:09:00.827141 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerDied","Data":"f7a730b37dad0c6361c00e9b0492e8625c73fe67396796d17b9666df024ef1b3"} Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.125006 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279101 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279318 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279353 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279414 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279500 4954 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279596 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkw86\" (UniqueName: \"kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279624 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts\") pod \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\" (UID: \"37e93c4d-2ef7-4fd1-873b-9175a1a57997\") " Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.279973 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.280073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.280520 4954 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.280548 4954 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37e93c4d-2ef7-4fd1-873b-9175a1a57997-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.284516 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts" (OuterVolumeSpecName: "scripts") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.285117 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86" (OuterVolumeSpecName: "kube-api-access-bkw86") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "kube-api-access-bkw86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.311741 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.346079 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.381757 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkw86\" (UniqueName: \"kubernetes.io/projected/37e93c4d-2ef7-4fd1-873b-9175a1a57997-kube-api-access-bkw86\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.381790 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.381802 4954 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.381810 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.392860 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.402390 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data" (OuterVolumeSpecName: "config-data") pod "37e93c4d-2ef7-4fd1-873b-9175a1a57997" (UID: "37e93c4d-2ef7-4fd1-873b-9175a1a57997"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.484399 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.484433 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e93c4d-2ef7-4fd1-873b-9175a1a57997-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.841159 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37e93c4d-2ef7-4fd1-873b-9175a1a57997","Type":"ContainerDied","Data":"e215347e1a2fc06355ef187e42878568609393f58ef139a9420da403ca9b99a1"} Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.841234 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.841234 4954 scope.go:117] "RemoveContainer" containerID="7da269384c88bc88e46293ecb948d684345d5bc287fe49c3bf00557c14ed6cb6" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.873942 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.886423 4954 scope.go:117] "RemoveContainer" containerID="784e9c867597865d57f814154c4079ba5d911c2579cedac68758144d21b8346f" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.896049 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.915294 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:09:01 crc kubenswrapper[4954]: E1206 09:09:01.915837 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="proxy-httpd" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.915864 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="proxy-httpd" Dec 06 09:09:01 crc kubenswrapper[4954]: E1206 09:09:01.915900 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="sg-core" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.915908 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="sg-core" Dec 06 09:09:01 crc kubenswrapper[4954]: E1206 09:09:01.915919 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-central-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.915927 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-central-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: E1206 09:09:01.915957 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-notification-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.915963 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-notification-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.916147 4954 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-notification-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.916163 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="sg-core" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.916176 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="proxy-httpd" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.916194 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" containerName="ceilometer-central-agent" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.918124 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.992493 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997640 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-run-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9l4\" (UniqueName: \"kubernetes.io/projected/185b09ad-fc16-4179-bafe-cd1bf48fef37-kube-api-access-mc9l4\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997902 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-config-data\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.997939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.998067 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-scripts\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:01 crc kubenswrapper[4954]: I1206 09:09:01.998097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-log-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.004130 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.004358 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.010226 4954 scope.go:117] "RemoveContainer" containerID="9402e740f0e773a6304964122021a911e3ae1ea004423f9e484d757674dc2c3f" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.013804 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.045405 4954 scope.go:117] "RemoveContainer" containerID="f7a730b37dad0c6361c00e9b0492e8625c73fe67396796d17b9666df024ef1b3" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.101399 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-scripts\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.101458 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-log-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.101505 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.101543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-run-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102004 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102377 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-log-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102679 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185b09ad-fc16-4179-bafe-cd1bf48fef37-run-httpd\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102755 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9l4\" (UniqueName: \"kubernetes.io/projected/185b09ad-fc16-4179-bafe-cd1bf48fef37-kube-api-access-mc9l4\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102862 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-config-data\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.102914 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.106333 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.106409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-scripts\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.106351 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.108530 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-config-data\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.113365 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b09ad-fc16-4179-bafe-cd1bf48fef37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.124733 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9l4\" (UniqueName: \"kubernetes.io/projected/185b09ad-fc16-4179-bafe-cd1bf48fef37-kube-api-access-mc9l4\") pod \"ceilometer-0\" (UID: \"185b09ad-fc16-4179-bafe-cd1bf48fef37\") " pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.316056 4954 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.810921 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 09:09:02 crc kubenswrapper[4954]: I1206 09:09:02.854215 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185b09ad-fc16-4179-bafe-cd1bf48fef37","Type":"ContainerStarted","Data":"8ff29834ea704a32cfe858ea099e16e71ad3a52b0636346cf904d89ebefbbc5b"} Dec 06 09:09:03 crc kubenswrapper[4954]: I1206 09:09:03.456217 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e93c4d-2ef7-4fd1-873b-9175a1a57997" path="/var/lib/kubelet/pods/37e93c4d-2ef7-4fd1-873b-9175a1a57997/volumes" Dec 06 09:09:04 crc kubenswrapper[4954]: I1206 09:09:04.896827 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185b09ad-fc16-4179-bafe-cd1bf48fef37","Type":"ContainerStarted","Data":"fc35c8170c0cf6da8c0b7ec91fb989643c5509657543e124b4df75c0560048d8"} Dec 06 09:09:04 crc kubenswrapper[4954]: I1206 09:09:04.898432 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185b09ad-fc16-4179-bafe-cd1bf48fef37","Type":"ContainerStarted","Data":"e7254c11ac4e27174b56a587d3e5dbd2a22f8054f087d0e170154eda449686f5"} Dec 06 09:09:05 crc kubenswrapper[4954]: I1206 09:09:05.915358 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185b09ad-fc16-4179-bafe-cd1bf48fef37","Type":"ContainerStarted","Data":"75188022f25028d8f2318b3e7750cb6fd698baaf81cb31c219b85d9b3cec216a"} Dec 06 09:09:06 crc kubenswrapper[4954]: I1206 09:09:06.928747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185b09ad-fc16-4179-bafe-cd1bf48fef37","Type":"ContainerStarted","Data":"2f6cdaf9bd469d11d6b238440cfdc946a16613fbd633178a7eb111a1ea33ffeb"} Dec 06 09:09:06 crc kubenswrapper[4954]: I1206 09:09:06.929081 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 09:09:11 crc kubenswrapper[4954]: I1206 09:09:11.444457 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:09:11 crc kubenswrapper[4954]: E1206 09:09:11.445548 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.301650 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=13.636979048 podStartE2EDuration="17.30162309s" podCreationTimestamp="2025-12-06 09:09:01 +0000 UTC" firstStartedPulling="2025-12-06 09:09:02.805750531 +0000 UTC m=+7917.619109920" lastFinishedPulling="2025-12-06 09:09:06.470394573 +0000 UTC m=+7921.283753962" observedRunningTime="2025-12-06 09:09:06.95708217 +0000 UTC m=+7921.770441589" watchObservedRunningTime="2025-12-06 09:09:18.30162309 +0000 UTC m=+7933.114982519" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.312466 4954 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.315152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.337052 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.374817 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.374939 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.375125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh8l7\" (UniqueName: \"kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.477481 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh8l7\" (UniqueName: \"kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.477593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.477688 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.478104 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.478176 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content\") pod \"certified-operators-g6xtd\" (UID: 
\"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.499546 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh8l7\" (UniqueName: \"kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7\") pod \"certified-operators-g6xtd\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:18 crc kubenswrapper[4954]: I1206 09:09:18.635460 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:19 crc kubenswrapper[4954]: I1206 09:09:19.327886 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:20 crc kubenswrapper[4954]: I1206 09:09:20.085685 4954 generic.go:334] "Generic (PLEG): container finished" podID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerID="1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf" exitCode=0 Dec 06 09:09:20 crc kubenswrapper[4954]: I1206 09:09:20.085745 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerDied","Data":"1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf"} Dec 06 09:09:20 crc kubenswrapper[4954]: I1206 09:09:20.086054 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerStarted","Data":"d6ed7d4e06ac53142da66c493139a85c727ce7d7b250a97cc2d639540648297b"} Dec 06 09:09:22 crc kubenswrapper[4954]: I1206 09:09:22.106779 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerStarted","Data":"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932"} Dec 06 09:09:23 crc kubenswrapper[4954]: I1206 09:09:23.119876 4954 generic.go:334] "Generic (PLEG): container finished" podID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerID="a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932" exitCode=0 Dec 06 09:09:23 crc kubenswrapper[4954]: I1206 09:09:23.119935 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerDied","Data":"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932"} Dec 06 09:09:24 crc kubenswrapper[4954]: I1206 09:09:24.133953 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerStarted","Data":"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162"} Dec 06 09:09:24 crc kubenswrapper[4954]: I1206 09:09:24.195090 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g6xtd" podStartSLOduration=2.71230672 podStartE2EDuration="6.19506494s" podCreationTimestamp="2025-12-06 09:09:18 +0000 UTC" firstStartedPulling="2025-12-06 09:09:20.087225115 +0000 UTC m=+7934.900584504" lastFinishedPulling="2025-12-06 09:09:23.569983345 +0000 UTC m=+7938.383342724" observedRunningTime="2025-12-06 09:09:24.177739126 +0000 UTC m=+7938.991098515" 
watchObservedRunningTime="2025-12-06 09:09:24.19506494 +0000 UTC m=+7939.008424329" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.150369 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerID="ed20b23815644375f353b27f182e8d1440f89b0bb7dfec830c60a57ea6a4932d" exitCode=137 Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.150583 4954 generic.go:334] "Generic (PLEG): container finished" podID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerID="9f1be2c436f95989236c024dde7066d64679084c6122c5b5af3ebfbbd7d4a831" exitCode=137 Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.151471 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerDied","Data":"ed20b23815644375f353b27f182e8d1440f89b0bb7dfec830c60a57ea6a4932d"} Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.151504 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerDied","Data":"9f1be2c436f95989236c024dde7066d64679084c6122c5b5af3ebfbbd7d4a831"} Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.450147 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:09:25 crc kubenswrapper[4954]: E1206 09:09:25.450495 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.503087 4954 scope.go:117] "RemoveContainer" containerID="4d426f7347144e8477495f01de0100abebba3f45d74ef577632ce7dd287936b0" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.549973 4954 scope.go:117] "RemoveContainer" containerID="a06e5d0a634317d55b03665a2dd85d91619c9c649569c42261b501bfd0a3599f" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.572520 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.684468 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data\") pod \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.685232 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6pbd\" (UniqueName: \"kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd\") pod \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.685328 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle\") pod \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.685465 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts\") pod \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\" (UID: \"a85a67ba-1b95-44bc-8792-16ede1bc5f33\") " Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.699036 4954 scope.go:117] "RemoveContainer" containerID="073a19617c17941e084c57a92ccacf88aede8bc77f16ff842329f11ea49766e3" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.705183 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts" (OuterVolumeSpecName: "scripts") pod "a85a67ba-1b95-44bc-8792-16ede1bc5f33" (UID: "a85a67ba-1b95-44bc-8792-16ede1bc5f33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.716125 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd" (OuterVolumeSpecName: "kube-api-access-k6pbd") pod "a85a67ba-1b95-44bc-8792-16ede1bc5f33" (UID: "a85a67ba-1b95-44bc-8792-16ede1bc5f33"). InnerVolumeSpecName "kube-api-access-k6pbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.788166 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6pbd\" (UniqueName: \"kubernetes.io/projected/a85a67ba-1b95-44bc-8792-16ede1bc5f33-kube-api-access-k6pbd\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.788862 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.856729 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a85a67ba-1b95-44bc-8792-16ede1bc5f33" (UID: "a85a67ba-1b95-44bc-8792-16ede1bc5f33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.868846 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data" (OuterVolumeSpecName: "config-data") pod "a85a67ba-1b95-44bc-8792-16ede1bc5f33" (UID: "a85a67ba-1b95-44bc-8792-16ede1bc5f33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.890007 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.890040 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85a67ba-1b95-44bc-8792-16ede1bc5f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.944039 4954 scope.go:117] "RemoveContainer" containerID="a7aea1c152e4b6757f714d274e08c89358664c17ef13d3234675f1b21a924fb2" Dec 06 09:09:25 crc kubenswrapper[4954]: I1206 09:09:25.969950 4954 scope.go:117] "RemoveContainer" containerID="e2361eba2d155e2b896747bb46e23176728e7d90688c7f3435fb18053039fb48" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.163251 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a85a67ba-1b95-44bc-8792-16ede1bc5f33","Type":"ContainerDied","Data":"448f1e7603bdc33fbf677b8ad9e5df1209a7c62197a50f7cac7dbd941dcd43d9"} Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.163304 4954 scope.go:117] "RemoveContainer" containerID="ed20b23815644375f353b27f182e8d1440f89b0bb7dfec830c60a57ea6a4932d" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.163309 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.186967 4954 scope.go:117] "RemoveContainer" containerID="9f1be2c436f95989236c024dde7066d64679084c6122c5b5af3ebfbbd7d4a831" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.203200 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.218376 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.231229 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 06 09:09:26 crc kubenswrapper[4954]: E1206 09:09:26.231737 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-notifier" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.231751 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-notifier" Dec 06 09:09:26 crc kubenswrapper[4954]: E1206 09:09:26.231774 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-evaluator" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.231781 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-evaluator" Dec 06 09:09:26 crc kubenswrapper[4954]: E1206 09:09:26.231799 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-api" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.231807 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-api" Dec 06 09:09:26 crc kubenswrapper[4954]: E1206 09:09:26.231828 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-listener" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.231836 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-listener" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.232090 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-api" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.232111 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-notifier" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.232122 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-evaluator" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.232138 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" containerName="aodh-listener" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.234306 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.235629 4954 scope.go:117] "RemoveContainer" containerID="d84b351d670b82d045a6674180f725c54c4db211fc90208fcf8589701cffd48f" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.241552 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.282224 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.282400 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.282756 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mx8tx" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.282772 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.283918 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.306246 4954 scope.go:117] "RemoveContainer" containerID="825f1570789deef4815b68f1d383d0bfd385ca52afb8f912c24b5bb23d82ac9d" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.399358 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-scripts\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.399997 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-public-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.400073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.400186 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-config-data\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.400268 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54vg\" (UniqueName: \"kubernetes.io/projected/c8944031-0936-4953-849c-83eb9f2d9d7f-kube-api-access-l54vg\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.400373 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " 
pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.501573 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.501971 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-config-data\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.502015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54vg\" (UniqueName: \"kubernetes.io/projected/c8944031-0936-4953-849c-83eb9f2d9d7f-kube-api-access-l54vg\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.502043 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.502110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-scripts\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.502198 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-public-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.506409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.506696 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-config-data\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.506732 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-public-tls-certs\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.510645 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-scripts\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.510841 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8944031-0936-4953-849c-83eb9f2d9d7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.523470 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54vg\" (UniqueName: \"kubernetes.io/projected/c8944031-0936-4953-849c-83eb9f2d9d7f-kube-api-access-l54vg\") pod \"aodh-0\" (UID: \"c8944031-0936-4953-849c-83eb9f2d9d7f\") " pod="openstack/aodh-0" Dec 06 09:09:26 crc kubenswrapper[4954]: I1206 09:09:26.596071 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 06 09:09:27 crc kubenswrapper[4954]: I1206 09:09:27.398338 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:09:27 crc kubenswrapper[4954]: I1206 09:09:27.398545 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 06 09:09:27 crc kubenswrapper[4954]: I1206 09:09:27.457073 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85a67ba-1b95-44bc-8792-16ede1bc5f33" path="/var/lib/kubelet/pods/a85a67ba-1b95-44bc-8792-16ede1bc5f33/volumes" Dec 06 09:09:28 crc kubenswrapper[4954]: I1206 09:09:28.344699 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c8944031-0936-4953-849c-83eb9f2d9d7f","Type":"ContainerStarted","Data":"57b71437f5848b38bb51d48f33791a2d71ea23254c96bb7120d21893ce31ecfc"} Dec 06 09:09:28 crc kubenswrapper[4954]: I1206 09:09:28.345044 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c8944031-0936-4953-849c-83eb9f2d9d7f","Type":"ContainerStarted","Data":"ef9526377d4891054ef5d880d61f328d556d7612132d03b68959ba63a5043e8d"} Dec 06 09:09:28 crc kubenswrapper[4954]: I1206 09:09:28.636731 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:28 crc kubenswrapper[4954]: I1206 09:09:28.637032 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:28 crc kubenswrapper[4954]: I1206 09:09:28.689225 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:29 crc kubenswrapper[4954]: I1206 09:09:29.357069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c8944031-0936-4953-849c-83eb9f2d9d7f","Type":"ContainerStarted","Data":"187474820d59004c540dc779deaad98fd60b9c6cce6c0535956e3113ffa58867"} Dec 06 09:09:29 crc kubenswrapper[4954]: I1206 09:09:29.357370 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c8944031-0936-4953-849c-83eb9f2d9d7f","Type":"ContainerStarted","Data":"981a28045281f8429a4a88f562258d2a5940b6e31bb9be1744673a00afcebf57"} Dec 06 09:09:29 crc kubenswrapper[4954]: I1206 09:09:29.415026 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:29 crc kubenswrapper[4954]: I1206 09:09:29.480004 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:30 crc kubenswrapper[4954]: I1206 09:09:30.366969 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"c8944031-0936-4953-849c-83eb9f2d9d7f","Type":"ContainerStarted","Data":"1a446ae49245bfdced721876a32960c3cc73af979047285dae28edeb9ff74999"} Dec 06 09:09:31 crc kubenswrapper[4954]: I1206 09:09:31.374963 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g6xtd" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="registry-server" containerID="cri-o://e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162" gracePeriod=2 Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.009489 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.034547 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=4.012120884 podStartE2EDuration="6.034523122s" podCreationTimestamp="2025-12-06 09:09:26 +0000 UTC" firstStartedPulling="2025-12-06 09:09:27.398128898 +0000 UTC m=+7942.211488287" lastFinishedPulling="2025-12-06 09:09:29.420531136 +0000 UTC m=+7944.233890525" observedRunningTime="2025-12-06 09:09:30.388007724 +0000 UTC m=+7945.201367113" watchObservedRunningTime="2025-12-06 09:09:32.034523122 +0000 UTC m=+7946.847882541" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.132824 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content\") pod \"59837cd9-7416-4497-9c24-7341bd4f3a4c\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.133007 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh8l7\" (UniqueName: \"kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7\") pod \"59837cd9-7416-4497-9c24-7341bd4f3a4c\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.133271 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities\") pod \"59837cd9-7416-4497-9c24-7341bd4f3a4c\" (UID: \"59837cd9-7416-4497-9c24-7341bd4f3a4c\") " Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.134542 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities" (OuterVolumeSpecName: "utilities") pod "59837cd9-7416-4497-9c24-7341bd4f3a4c" (UID: "59837cd9-7416-4497-9c24-7341bd4f3a4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.142757 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7" (OuterVolumeSpecName: "kube-api-access-qh8l7") pod "59837cd9-7416-4497-9c24-7341bd4f3a4c" (UID: "59837cd9-7416-4497-9c24-7341bd4f3a4c"). InnerVolumeSpecName "kube-api-access-qh8l7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.176505 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59837cd9-7416-4497-9c24-7341bd4f3a4c" (UID: "59837cd9-7416-4497-9c24-7341bd4f3a4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.236104 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.236135 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59837cd9-7416-4497-9c24-7341bd4f3a4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.236149 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh8l7\" (UniqueName: \"kubernetes.io/projected/59837cd9-7416-4497-9c24-7341bd4f3a4c-kube-api-access-qh8l7\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.329805 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.390505 4954 generic.go:334] "Generic (PLEG): container finished" podID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerID="e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162" exitCode=0 Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.390553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerDied","Data":"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162"} Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.390602 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xtd" event={"ID":"59837cd9-7416-4497-9c24-7341bd4f3a4c","Type":"ContainerDied","Data":"d6ed7d4e06ac53142da66c493139a85c727ce7d7b250a97cc2d639540648297b"} Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.390625 4954 scope.go:117] "RemoveContainer" containerID="e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.390627 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g6xtd" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.416476 4954 scope.go:117] "RemoveContainer" containerID="a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.435339 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.449097 4954 scope.go:117] "RemoveContainer" containerID="1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.454071 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g6xtd"] Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.500743 4954 scope.go:117] "RemoveContainer" containerID="e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162" Dec 06 09:09:32 crc kubenswrapper[4954]: E1206 09:09:32.501337 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162\": container with ID starting with e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162 not found: ID does not exist" containerID="e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.501447 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162"} err="failed to get container status \"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162\": rpc error: code = NotFound desc = could not find container \"e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162\": container with ID starting with e888d66cdd341389ed75b27ad27eb50de76d344a0b43e07c7cc2fe1fd4738162 not found: ID does not exist" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.501553 4954 scope.go:117] "RemoveContainer" containerID="a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932" Dec 06 09:09:32 crc kubenswrapper[4954]: E1206 09:09:32.502142 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932\": container with ID starting with a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932 not found: ID does not exist" containerID="a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.502185 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932"} err="failed to get container status \"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932\": rpc error: code = NotFound desc = could not find container \"a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932\": container with ID starting with a4d4e9bc4c29d312c064ec3d55e57dcb03dbe648fdc8e0768f6cf14eeb95f932 not found: ID does not exist" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.502212 4954 scope.go:117] "RemoveContainer" containerID="1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf" Dec 06 09:09:32 crc kubenswrapper[4954]: E1206 09:09:32.502686 4954 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf\": container with ID starting with 1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf not found: ID does not exist" containerID="1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf" Dec 06 09:09:32 crc kubenswrapper[4954]: I1206 09:09:32.502707 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf"} err="failed to get container status \"1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf\": rpc error: code = NotFound desc = could not find container \"1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf\": container with ID starting with 1448467ad4e3d6cbfcf23715c0c3c85e29f40707444552fa99536c9ec74609cf not found: ID does not exist" Dec 06 09:09:33 crc kubenswrapper[4954]: I1206 09:09:33.454903 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" path="/var/lib/kubelet/pods/59837cd9-7416-4497-9c24-7341bd4f3a4c/volumes" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.568710 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:36 crc kubenswrapper[4954]: E1206 09:09:36.569894 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="registry-server" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.569914 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="registry-server" Dec 06 09:09:36 crc kubenswrapper[4954]: E1206 09:09:36.569939 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="extract-content" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.569946 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="extract-content" Dec 06 09:09:36 crc kubenswrapper[4954]: E1206 09:09:36.569983 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="extract-utilities" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.569992 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="extract-utilities" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.570250 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="59837cd9-7416-4497-9c24-7341bd4f3a4c" containerName="registry-server" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.573879 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.579951 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.605734 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.686684 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.686860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.687396 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.687461 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.687626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44p72\" (UniqueName: \"kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.687861 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789703 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789744 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " 
pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44p72\" (UniqueName: \"kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789817 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789882 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.789906 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.790939 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.790978 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.791176 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.791218 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.791358 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.818619 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-44p72\" (UniqueName: \"kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72\") pod \"dnsmasq-dns-55d54b56c-8zdk2\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:36 crc kubenswrapper[4954]: I1206 09:09:36.906752 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:37 crc kubenswrapper[4954]: I1206 09:09:37.391819 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:37 crc kubenswrapper[4954]: I1206 09:09:37.484090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" event={"ID":"6a108025-4e88-4b52-919a-ce5f0ac8c01e","Type":"ContainerStarted","Data":"293cde7408934a1eae264fcbb7a1c189886fe10a415ece81b2f4eaa072080846"} Dec 06 09:09:38 crc kubenswrapper[4954]: I1206 09:09:38.461181 4954 generic.go:334] "Generic (PLEG): container finished" podID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerID="b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265" exitCode=0 Dec 06 09:09:38 crc kubenswrapper[4954]: I1206 09:09:38.461277 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" event={"ID":"6a108025-4e88-4b52-919a-ce5f0ac8c01e","Type":"ContainerDied","Data":"b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265"} Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.035794 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c137-account-create-update-xv896"] Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.045676 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-77p2n"] Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.057946 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c137-account-create-update-xv896"] Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.070621 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-77p2n"] Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.443369 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:09:39 crc kubenswrapper[4954]: E1206 09:09:39.443734 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.454128 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1478a06b-de76-4e6e-a09f-bf3599f5fad0" path="/var/lib/kubelet/pods/1478a06b-de76-4e6e-a09f-bf3599f5fad0/volumes" Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.454706 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d763e6-1e5f-412b-9e16-f4cdf3cf98ac" path="/var/lib/kubelet/pods/25d763e6-1e5f-412b-9e16-f4cdf3cf98ac/volumes" Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.471375 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" 
event={"ID":"6a108025-4e88-4b52-919a-ce5f0ac8c01e","Type":"ContainerStarted","Data":"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772"} Dec 06 09:09:39 crc kubenswrapper[4954]: I1206 09:09:39.472535 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:46 crc kubenswrapper[4954]: I1206 09:09:46.907749 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:46 crc kubenswrapper[4954]: I1206 09:09:46.935342 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" podStartSLOduration=10.935323478 podStartE2EDuration="10.935323478s" podCreationTimestamp="2025-12-06 09:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:39.49568886 +0000 UTC m=+7954.309048249" watchObservedRunningTime="2025-12-06 09:09:46.935323478 +0000 UTC m=+7961.748682867" Dec 06 09:09:46 crc kubenswrapper[4954]: I1206 09:09:46.974866 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:09:46 crc kubenswrapper[4954]: I1206 09:09:46.975148 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="dnsmasq-dns" containerID="cri-o://d3c62812cc251143e4c8eea60ca6dbd6a5e40ba07d0ff37780eed9c691086b04" gracePeriod=10 Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.167912 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.170209 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.172455 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.177977 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238534 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmm5f\" (UniqueName: \"kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238641 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238669 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238689 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238750 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.238816 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342181 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker\") pod 
\"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342523 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342548 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342597 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342656 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342741 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmm5f\" (UniqueName: \"kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.342804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.343304 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.346969 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.347427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") 
" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.356932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.357337 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.357413 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.364357 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmm5f\" (UniqueName: \"kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f\") pod \"dnsmasq-dns-75dbb898d7-cqrnp\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.494418 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.557073 4954 generic.go:334] "Generic (PLEG): container finished" podID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerID="d3c62812cc251143e4c8eea60ca6dbd6a5e40ba07d0ff37780eed9c691086b04" exitCode=0 Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.557119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" event={"ID":"edc42899-5e6c-4354-92e3-d2a3dfa7707a","Type":"ContainerDied","Data":"d3c62812cc251143e4c8eea60ca6dbd6a5e40ba07d0ff37780eed9c691086b04"} Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.557146 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" event={"ID":"edc42899-5e6c-4354-92e3-d2a3dfa7707a","Type":"ContainerDied","Data":"f75abca5c29f5b0f13657dab28f2992430824754a4d00126991da907ee89f698"} Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.557156 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f75abca5c29f5b0f13657dab28f2992430824754a4d00126991da907ee89f698" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.630502 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.650178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb\") pod \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.650252 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config\") pod \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.650453 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc\") pod \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.650587 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvsvq\" (UniqueName: \"kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq\") pod \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.650659 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb\") pod \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\" (UID: \"edc42899-5e6c-4354-92e3-d2a3dfa7707a\") " Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.657954 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq" (OuterVolumeSpecName: "kube-api-access-cvsvq") pod "edc42899-5e6c-4354-92e3-d2a3dfa7707a" (UID: "edc42899-5e6c-4354-92e3-d2a3dfa7707a"). InnerVolumeSpecName "kube-api-access-cvsvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.717720 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config" (OuterVolumeSpecName: "config") pod "edc42899-5e6c-4354-92e3-d2a3dfa7707a" (UID: "edc42899-5e6c-4354-92e3-d2a3dfa7707a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.725244 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "edc42899-5e6c-4354-92e3-d2a3dfa7707a" (UID: "edc42899-5e6c-4354-92e3-d2a3dfa7707a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.727214 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "edc42899-5e6c-4354-92e3-d2a3dfa7707a" (UID: "edc42899-5e6c-4354-92e3-d2a3dfa7707a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.743183 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "edc42899-5e6c-4354-92e3-d2a3dfa7707a" (UID: "edc42899-5e6c-4354-92e3-d2a3dfa7707a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.753234 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.753411 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvsvq\" (UniqueName: \"kubernetes.io/projected/edc42899-5e6c-4354-92e3-d2a3dfa7707a-kube-api-access-cvsvq\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.753472 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.753557 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:47 crc kubenswrapper[4954]: I1206 09:09:47.753634 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edc42899-5e6c-4354-92e3-d2a3dfa7707a-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:48 crc kubenswrapper[4954]: I1206 09:09:48.032165 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:09:48 crc kubenswrapper[4954]: W1206 09:09:48.038372 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c6332f3_555f_4c91_a763_3152fac18bef.slice/crio-f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a WatchSource:0}: Error finding container f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a: Status 404 returned error can't find the container with id f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a Dec 06 09:09:48 crc kubenswrapper[4954]: I1206 09:09:48.569166 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" event={"ID":"1c6332f3-555f-4c91-a763-3152fac18bef","Type":"ContainerStarted","Data":"f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a"} Dec 06 09:09:48 crc kubenswrapper[4954]: I1206 09:09:48.569217 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" Dec 06 09:09:48 crc kubenswrapper[4954]: I1206 09:09:48.611068 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:09:48 crc kubenswrapper[4954]: I1206 09:09:48.619884 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t"] Dec 06 09:09:49 crc kubenswrapper[4954]: I1206 09:09:49.455683 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" path="/var/lib/kubelet/pods/edc42899-5e6c-4354-92e3-d2a3dfa7707a/volumes" Dec 06 09:09:49 crc kubenswrapper[4954]: I1206 09:09:49.587270 4954 generic.go:334] "Generic (PLEG): container finished" podID="1c6332f3-555f-4c91-a763-3152fac18bef" containerID="28a9659ded76ae9307a22b44789f377783e998f20b4d4cc4d371aaaed066808e" exitCode=0 Dec 06 09:09:49 crc kubenswrapper[4954]: I1206 09:09:49.587349 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" event={"ID":"1c6332f3-555f-4c91-a763-3152fac18bef","Type":"ContainerDied","Data":"28a9659ded76ae9307a22b44789f377783e998f20b4d4cc4d371aaaed066808e"} Dec 06 09:09:50 crc kubenswrapper[4954]: I1206 09:09:50.598630 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" event={"ID":"1c6332f3-555f-4c91-a763-3152fac18bef","Type":"ContainerStarted","Data":"93b0346a6bb6032e22ea6b756090cdd93b8c9aa7c3382b69360beb506054efa2"} Dec 06 09:09:50 crc kubenswrapper[4954]: I1206 09:09:50.598917 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:50 crc kubenswrapper[4954]: I1206 09:09:50.624884 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" podStartSLOduration=3.624864487 podStartE2EDuration="3.624864487s" podCreationTimestamp="2025-12-06 09:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:09:50.619790061 +0000 UTC m=+7965.433149510" watchObservedRunningTime="2025-12-06 09:09:50.624864487 +0000 UTC m=+7965.438223876" Dec 06 09:09:52 crc kubenswrapper[4954]: I1206 09:09:52.624344 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f5f7d8bd5-fgd7t" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.110:5353: i/o timeout" Dec 06 09:09:54 crc kubenswrapper[4954]: I1206 09:09:54.444125 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:09:54 crc kubenswrapper[4954]: E1206 09:09:54.444760 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:09:57 crc kubenswrapper[4954]: I1206 09:09:57.496802 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:09:57 crc kubenswrapper[4954]: I1206 09:09:57.591668 4954 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:57 crc kubenswrapper[4954]: I1206 09:09:57.592121 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="dnsmasq-dns" containerID="cri-o://cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772" gracePeriod=10 Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.034156 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:09:58 crc kubenswrapper[4954]: E1206 09:09:58.035030 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="dnsmasq-dns" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.035054 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="dnsmasq-dns" Dec 06 09:09:58 crc kubenswrapper[4954]: E1206 09:09:58.035081 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="init" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.035091 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="init" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.035331 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc42899-5e6c-4354-92e3-d2a3dfa7707a" containerName="dnsmasq-dns" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.036697 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.047196 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.078114 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.078944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.079066 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sd7\" (UniqueName: \"kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.079137 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc 
kubenswrapper[4954]: I1206 09:09:58.079244 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.079343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.079428 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.085381 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180017 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180167 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44p72\" (UniqueName: \"kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180290 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180313 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb\") pod \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\" (UID: \"6a108025-4e88-4b52-919a-ce5f0ac8c01e\") " Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 
09:09:58.180618 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180666 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180683 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sd7\" (UniqueName: \"kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180747 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.180813 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.181437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.181799 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.181994 4954 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.182161 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.182544 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.182555 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.185241 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72" (OuterVolumeSpecName: "kube-api-access-44p72") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "kube-api-access-44p72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.202188 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sd7\" (UniqueName: \"kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7\") pod \"dnsmasq-dns-77648b885f-ll6cm\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.230794 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.231343 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.234678 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.235538 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.242448 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config" (OuterVolumeSpecName: "config") pod "6a108025-4e88-4b52-919a-ce5f0ac8c01e" (UID: "6a108025-4e88-4b52-919a-ce5f0ac8c01e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282775 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282820 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282833 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282847 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44p72\" (UniqueName: \"kubernetes.io/projected/6a108025-4e88-4b52-919a-ce5f0ac8c01e-kube-api-access-44p72\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282857 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.282867 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a108025-4e88-4b52-919a-ce5f0ac8c01e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.397932 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.708426 4954 generic.go:334] "Generic (PLEG): container finished" podID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerID="cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772" exitCode=0 Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.708473 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" event={"ID":"6a108025-4e88-4b52-919a-ce5f0ac8c01e","Type":"ContainerDied","Data":"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772"} Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.708627 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" event={"ID":"6a108025-4e88-4b52-919a-ce5f0ac8c01e","Type":"ContainerDied","Data":"293cde7408934a1eae264fcbb7a1c189886fe10a415ece81b2f4eaa072080846"} Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.708653 4954 scope.go:117] "RemoveContainer" containerID="cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.708531 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d54b56c-8zdk2" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.774818 4954 scope.go:117] "RemoveContainer" containerID="b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.779092 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.788419 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d54b56c-8zdk2"] Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.804335 4954 scope.go:117] "RemoveContainer" containerID="cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772" Dec 06 09:09:58 crc kubenswrapper[4954]: E1206 09:09:58.804837 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772\": container with ID starting with cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772 not found: ID does not exist" containerID="cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.804874 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772"} err="failed to get container status \"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772\": rpc error: code = NotFound desc = could not find container \"cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772\": container with ID starting with cc08b3388ad94945cbeba1bc1a114b936f7d8a05775645523dc33e0ccec82772 not found: ID does not exist" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.804901 4954 scope.go:117] "RemoveContainer" containerID="b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265" Dec 06 09:09:58 crc kubenswrapper[4954]: E1206 09:09:58.805148 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265\": container with ID starting with 
b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265 not found: ID does not exist" containerID="b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.805178 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265"} err="failed to get container status \"b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265\": rpc error: code = NotFound desc = could not find container \"b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265\": container with ID starting with b7d84f6b1b4ba78c55c01edd25f19335093efc1b4b07c45c6145cedc7395a265 not found: ID does not exist" Dec 06 09:09:58 crc kubenswrapper[4954]: I1206 09:09:58.868580 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:09:58 crc kubenswrapper[4954]: W1206 09:09:58.896859 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8664325b_eecf_4fdf_aa85_0a399a4b29c3.slice/crio-8c96d73bc317d76b42612281120ae5b228a8dfa23b65c2093a0983a7c82b7d27 WatchSource:0}: Error finding container 8c96d73bc317d76b42612281120ae5b228a8dfa23b65c2093a0983a7c82b7d27: Status 404 returned error can't find the container with id 8c96d73bc317d76b42612281120ae5b228a8dfa23b65c2093a0983a7c82b7d27 Dec 06 09:09:59 crc kubenswrapper[4954]: I1206 09:09:59.455455 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" path="/var/lib/kubelet/pods/6a108025-4e88-4b52-919a-ce5f0ac8c01e/volumes" Dec 06 09:09:59 crc kubenswrapper[4954]: I1206 09:09:59.722473 4954 generic.go:334] "Generic (PLEG): container finished" podID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerID="c8b76ccac25d01f437eaea230ed0ae41432c9fe1429d68e11e3a1ae734cd03b3" exitCode=0 Dec 06 09:09:59 crc kubenswrapper[4954]: I1206 09:09:59.722512 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" event={"ID":"8664325b-eecf-4fdf-aa85-0a399a4b29c3","Type":"ContainerDied","Data":"c8b76ccac25d01f437eaea230ed0ae41432c9fe1429d68e11e3a1ae734cd03b3"} Dec 06 09:09:59 crc kubenswrapper[4954]: I1206 09:09:59.722541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" event={"ID":"8664325b-eecf-4fdf-aa85-0a399a4b29c3","Type":"ContainerStarted","Data":"8c96d73bc317d76b42612281120ae5b228a8dfa23b65c2093a0983a7c82b7d27"} Dec 06 09:10:00 crc kubenswrapper[4954]: I1206 09:10:00.733792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" event={"ID":"8664325b-eecf-4fdf-aa85-0a399a4b29c3","Type":"ContainerStarted","Data":"f07ffba0499a139f0ec42b4e5d985dbdfd0c9dd5a579844a2d0a9ba510fa0011"} Dec 06 09:10:00 crc kubenswrapper[4954]: I1206 09:10:00.734274 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:10:00 crc kubenswrapper[4954]: I1206 09:10:00.754684 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" podStartSLOduration=3.754664135 podStartE2EDuration="3.754664135s" podCreationTimestamp="2025-12-06 09:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:10:00.750813041 
+0000 UTC m=+7975.564172430" watchObservedRunningTime="2025-12-06 09:10:00.754664135 +0000 UTC m=+7975.568023524" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.214947 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:01 crc kubenswrapper[4954]: E1206 09:10:01.215477 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="dnsmasq-dns" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.215495 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="dnsmasq-dns" Dec 06 09:10:01 crc kubenswrapper[4954]: E1206 09:10:01.215506 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="init" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.215513 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="init" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.215739 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a108025-4e88-4b52-919a-ce5f0ac8c01e" containerName="dnsmasq-dns" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.218038 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.236327 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.240327 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.240374 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.240491 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rc7d\" (UniqueName: \"kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.343554 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rc7d\" (UniqueName: \"kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.343885 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities\") pod \"community-operators-knzzk\" 
(UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.343917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.344399 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.344398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.361531 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rc7d\" (UniqueName: \"kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d\") pod \"community-operators-knzzk\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:01 crc kubenswrapper[4954]: I1206 09:10:01.540457 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:02 crc kubenswrapper[4954]: I1206 09:10:02.045425 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:02 crc kubenswrapper[4954]: W1206 09:10:02.047109 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf10b0a_2161_423f_9cb2_f3aa72be786c.slice/crio-8e5d8dc71838a88d0e729c8c86841e96f1c0da94473086a1772629ebca06986c WatchSource:0}: Error finding container 8e5d8dc71838a88d0e729c8c86841e96f1c0da94473086a1772629ebca06986c: Status 404 returned error can't find the container with id 8e5d8dc71838a88d0e729c8c86841e96f1c0da94473086a1772629ebca06986c Dec 06 09:10:02 crc kubenswrapper[4954]: I1206 09:10:02.761493 4954 generic.go:334] "Generic (PLEG): container finished" podID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerID="60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab" exitCode=0 Dec 06 09:10:02 crc kubenswrapper[4954]: I1206 09:10:02.761546 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerDied","Data":"60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab"} Dec 06 09:10:02 crc kubenswrapper[4954]: I1206 09:10:02.761819 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerStarted","Data":"8e5d8dc71838a88d0e729c8c86841e96f1c0da94473086a1772629ebca06986c"} Dec 06 09:10:03 crc kubenswrapper[4954]: I1206 09:10:03.774320 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerStarted","Data":"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37"} Dec 06 09:10:05 crc kubenswrapper[4954]: I1206 09:10:05.806902 4954 generic.go:334] "Generic (PLEG): container finished" podID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerID="03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37" exitCode=0 Dec 06 09:10:05 crc kubenswrapper[4954]: I1206 09:10:05.807270 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerDied","Data":"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37"} Dec 06 09:10:06 crc kubenswrapper[4954]: I1206 09:10:06.821246 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerStarted","Data":"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1"} Dec 06 09:10:06 crc kubenswrapper[4954]: I1206 09:10:06.852887 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knzzk" podStartSLOduration=2.363273327 podStartE2EDuration="5.85286405s" podCreationTimestamp="2025-12-06 09:10:01 +0000 UTC" firstStartedPulling="2025-12-06 09:10:02.763523791 +0000 UTC m=+7977.576883180" lastFinishedPulling="2025-12-06 09:10:06.253114494 +0000 UTC m=+7981.066473903" observedRunningTime="2025-12-06 09:10:06.83866157 +0000 UTC m=+7981.652020979" watchObservedRunningTime="2025-12-06 09:10:06.85286405 +0000 UTC m=+7981.666223429" Dec 
06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.399838 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.463803 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.464061 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="dnsmasq-dns" containerID="cri-o://93b0346a6bb6032e22ea6b756090cdd93b8c9aa7c3382b69360beb506054efa2" gracePeriod=10 Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.849216 4954 generic.go:334] "Generic (PLEG): container finished" podID="1c6332f3-555f-4c91-a763-3152fac18bef" containerID="93b0346a6bb6032e22ea6b756090cdd93b8c9aa7c3382b69360beb506054efa2" exitCode=0 Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.849292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" event={"ID":"1c6332f3-555f-4c91-a763-3152fac18bef","Type":"ContainerDied","Data":"93b0346a6bb6032e22ea6b756090cdd93b8c9aa7c3382b69360beb506054efa2"} Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.849572 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" event={"ID":"1c6332f3-555f-4c91-a763-3152fac18bef","Type":"ContainerDied","Data":"f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a"} Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.849589 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6c9f4a5c90c15f9b262e8a0f4c3a98b8362dcb8e8252f7a359dcd7f848b9c6a" Dec 06 09:10:08 crc kubenswrapper[4954]: I1206 09:10:08.919542 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.100994 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101247 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101461 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101545 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101680 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmm5f\" (UniqueName: \"kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101785 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.101935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config\") pod \"1c6332f3-555f-4c91-a763-3152fac18bef\" (UID: \"1c6332f3-555f-4c91-a763-3152fac18bef\") " Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.112026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f" (OuterVolumeSpecName: "kube-api-access-tmm5f") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "kube-api-access-tmm5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.162176 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "openstack-networker". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.187772 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.207992 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.208350 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmm5f\" (UniqueName: \"kubernetes.io/projected/1c6332f3-555f-4c91-a763-3152fac18bef-kube-api-access-tmm5f\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.208364 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-networker\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.215896 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.239496 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.245457 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config" (OuterVolumeSpecName: "config") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.265195 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "1c6332f3-555f-4c91-a763-3152fac18bef" (UID: "1c6332f3-555f-4c91-a763-3152fac18bef"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.311968 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.312043 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.312054 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.312066 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6332f3-555f-4c91-a763-3152fac18bef-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.444170 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:10:09 crc kubenswrapper[4954]: E1206 09:10:09.445745 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.858423 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb898d7-cqrnp" Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.883855 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:10:09 crc kubenswrapper[4954]: I1206 09:10:09.891862 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dbb898d7-cqrnp"] Dec 06 09:10:11 crc kubenswrapper[4954]: I1206 09:10:11.455152 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" path="/var/lib/kubelet/pods/1c6332f3-555f-4c91-a763-3152fac18bef/volumes" Dec 06 09:10:11 crc kubenswrapper[4954]: I1206 09:10:11.540880 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:11 crc kubenswrapper[4954]: I1206 09:10:11.540924 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:11 crc kubenswrapper[4954]: I1206 09:10:11.591555 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:11 crc kubenswrapper[4954]: I1206 09:10:11.954780 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:12 crc kubenswrapper[4954]: I1206 09:10:12.015727 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:13 crc kubenswrapper[4954]: I1206 09:10:13.902936 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knzzk" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="registry-server" containerID="cri-o://79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1" gracePeriod=2 Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.351204 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.530911 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content\") pod \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.531235 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities\") pod \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.531464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rc7d\" (UniqueName: \"kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d\") pod \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\" (UID: \"5bf10b0a-2161-423f-9cb2-f3aa72be786c\") " Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.532820 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities" (OuterVolumeSpecName: "utilities") pod "5bf10b0a-2161-423f-9cb2-f3aa72be786c" (UID: "5bf10b0a-2161-423f-9cb2-f3aa72be786c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.537102 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d" (OuterVolumeSpecName: "kube-api-access-4rc7d") pod "5bf10b0a-2161-423f-9cb2-f3aa72be786c" (UID: "5bf10b0a-2161-423f-9cb2-f3aa72be786c"). InnerVolumeSpecName "kube-api-access-4rc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.597109 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bf10b0a-2161-423f-9cb2-f3aa72be786c" (UID: "5bf10b0a-2161-423f-9cb2-f3aa72be786c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.633531 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.633576 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf10b0a-2161-423f-9cb2-f3aa72be786c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.633592 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rc7d\" (UniqueName: \"kubernetes.io/projected/5bf10b0a-2161-423f-9cb2-f3aa72be786c-kube-api-access-4rc7d\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.922225 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knzzk" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.922355 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerDied","Data":"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1"} Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.922890 4954 generic.go:334] "Generic (PLEG): container finished" podID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerID="79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1" exitCode=0 Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.922949 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knzzk" event={"ID":"5bf10b0a-2161-423f-9cb2-f3aa72be786c","Type":"ContainerDied","Data":"8e5d8dc71838a88d0e729c8c86841e96f1c0da94473086a1772629ebca06986c"} Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.923008 4954 scope.go:117] "RemoveContainer" containerID="79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.961611 4954 scope.go:117] "RemoveContainer" containerID="03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37" Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.965060 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.974358 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knzzk"] Dec 06 09:10:14 crc kubenswrapper[4954]: I1206 09:10:14.985782 4954 scope.go:117] "RemoveContainer" containerID="60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.042989 4954 scope.go:117] "RemoveContainer" containerID="79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1" Dec 06 09:10:15 crc kubenswrapper[4954]: E1206 09:10:15.043614 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1\": container with ID starting with 79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1 not found: ID does not exist" containerID="79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.043663 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1"} err="failed to get container status \"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1\": rpc error: code = NotFound desc = could not find container \"79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1\": container with ID starting with 79c58390dace09c8d85d415c315e75061b05207ac647344201f24fc06fd7e7e1 not found: ID does not exist" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.043694 4954 scope.go:117] "RemoveContainer" containerID="03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37" Dec 06 09:10:15 crc kubenswrapper[4954]: E1206 09:10:15.044106 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37\": container with ID 
starting with 03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37 not found: ID does not exist" containerID="03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.044150 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37"} err="failed to get container status \"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37\": rpc error: code = NotFound desc = could not find container \"03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37\": container with ID starting with 03de1ccd0280b482ca5a424c936105d59facc8f58fc6b7471c0af6f997639f37 not found: ID does not exist" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.044176 4954 scope.go:117] "RemoveContainer" containerID="60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab" Dec 06 09:10:15 crc kubenswrapper[4954]: E1206 09:10:15.044734 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab\": container with ID starting with 60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab not found: ID does not exist" containerID="60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.044796 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab"} err="failed to get container status \"60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab\": rpc error: code = NotFound desc = could not find container \"60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab\": container with ID starting with 60293cce24eda3c93d924aef98716ccdbda433f0961dc7f3da8e7fbed7833eab not found: ID does not exist" Dec 06 09:10:15 crc kubenswrapper[4954]: I1206 09:10:15.458012 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" path="/var/lib/kubelet/pods/5bf10b0a-2161-423f-9cb2-f3aa72be786c/volumes" Dec 06 09:10:22 crc kubenswrapper[4954]: I1206 09:10:22.444135 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002053 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm"] Dec 06 09:10:23 crc kubenswrapper[4954]: E1206 09:10:23.002744 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="extract-content" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002762 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="extract-content" Dec 06 09:10:23 crc kubenswrapper[4954]: E1206 09:10:23.002775 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="extract-utilities" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002782 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="extract-utilities" Dec 06 09:10:23 crc kubenswrapper[4954]: E1206 09:10:23.002801 4954 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="dnsmasq-dns" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002808 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="dnsmasq-dns" Dec 06 09:10:23 crc kubenswrapper[4954]: E1206 09:10:23.002818 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="init" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002823 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="init" Dec 06 09:10:23 crc kubenswrapper[4954]: E1206 09:10:23.002835 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="registry-server" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.002841 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="registry-server" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.003082 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6332f3-555f-4c91-a763-3152fac18bef" containerName="dnsmasq-dns" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.003106 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf10b0a-2161-423f-9cb2-f3aa72be786c" containerName="registry-server" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.003817 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.005968 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.006341 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.006441 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.007826 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.057064 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b"} Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.058588 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v"] Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.060007 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.062530 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.062855 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.074859 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm"] Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.083364 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v"] Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.110669 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.110829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.110879 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kq95\" (UniqueName: \"kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.110899 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212518 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212639 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kq95\" (UniqueName: \"kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212669 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212714 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212766 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212828 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.212923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.213013 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbvv\" (UniqueName: \"kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.232009 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 
crc kubenswrapper[4954]: I1206 09:10:23.232321 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.232357 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.239644 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kq95\" (UniqueName: \"kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.315047 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.315100 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.315178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.315207 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbvv\" (UniqueName: \"kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.318787 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.319705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.322082 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.332159 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbvv\" (UniqueName: \"kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.354494 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:23 crc kubenswrapper[4954]: I1206 09:10:23.379685 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:24 crc kubenswrapper[4954]: I1206 09:10:24.033837 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm"] Dec 06 09:10:24 crc kubenswrapper[4954]: W1206 09:10:24.035967 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05fb018b_abd2_4aec_8e4f_7705beb33bf6.slice/crio-98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231 WatchSource:0}: Error finding container 98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231: Status 404 returned error can't find the container with id 98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231 Dec 06 09:10:24 crc kubenswrapper[4954]: I1206 09:10:24.066310 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" event={"ID":"05fb018b-abd2-4aec-8e4f-7705beb33bf6","Type":"ContainerStarted","Data":"98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231"} Dec 06 09:10:24 crc kubenswrapper[4954]: I1206 09:10:24.638394 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v"] Dec 06 09:10:25 crc kubenswrapper[4954]: I1206 09:10:25.076341 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" event={"ID":"e11e817c-9bd1-4276-b2ec-c7732d9950ff","Type":"ContainerStarted","Data":"e8c5105424d572bb5ff0b46552d89bdc460575a2467ed7ab698bb133ed560078"} Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.059314 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tv6n7"] Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.068668 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tv6n7"] Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.097276 4954 scope.go:117] "RemoveContainer" containerID="abaa1d070c406cc0c6728a24a344c2f93be130b166c211357f87edea2eb1f511" Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.806661 4954 scope.go:117] "RemoveContainer" containerID="d3c62812cc251143e4c8eea60ca6dbd6a5e40ba07d0ff37780eed9c691086b04" Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.861641 4954 scope.go:117] "RemoveContainer" containerID="2653f73701e96675d3a7d4833b20eb482836944b84f5717ca05bdfe1f5de6196" Dec 06 09:10:26 crc kubenswrapper[4954]: I1206 09:10:26.887871 4954 scope.go:117] "RemoveContainer" containerID="428d38a6aea45d706de5c44686df6bc13d8dcdef2a6d045755f54720bb726c17" Dec 06 09:10:27 crc kubenswrapper[4954]: I1206 09:10:27.459170 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc9b245-21ff-4118-8725-2c0ad2eedf7c" path="/var/lib/kubelet/pods/edc9b245-21ff-4118-8725-2c0ad2eedf7c/volumes" Dec 06 09:10:34 crc kubenswrapper[4954]: I1206 09:10:34.161924 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" event={"ID":"e11e817c-9bd1-4276-b2ec-c7732d9950ff","Type":"ContainerStarted","Data":"1ab78f977dc494d39b2751753b2145088c2d4d22154d61cbb0583aff375902f1"} Dec 06 09:10:34 crc kubenswrapper[4954]: I1206 09:10:34.164325 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" event={"ID":"05fb018b-abd2-4aec-8e4f-7705beb33bf6","Type":"ContainerStarted","Data":"0e7140bc8c53d01483ea44be712435f136e56c3ef04cc40efc9ac14fa1de7cec"} Dec 06 09:10:34 crc kubenswrapper[4954]: I1206 09:10:34.196175 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" podStartSLOduration=3.464377683 podStartE2EDuration="12.196151939s" podCreationTimestamp="2025-12-06 09:10:22 +0000 UTC" firstStartedPulling="2025-12-06 09:10:24.654139717 +0000 UTC m=+7999.467499116" lastFinishedPulling="2025-12-06 09:10:33.385913983 +0000 UTC m=+8008.199273372" observedRunningTime="2025-12-06 09:10:34.176510442 +0000 UTC m=+8008.989869831" watchObservedRunningTime="2025-12-06 09:10:34.196151939 +0000 UTC m=+8009.009511348" Dec 06 09:10:34 crc kubenswrapper[4954]: I1206 09:10:34.207502 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" podStartSLOduration=2.902282854 podStartE2EDuration="12.207481632s" podCreationTimestamp="2025-12-06 09:10:22 +0000 UTC" firstStartedPulling="2025-12-06 09:10:24.037887038 +0000 UTC m=+7998.851246427" lastFinishedPulling="2025-12-06 09:10:33.343085816 +0000 UTC m=+8008.156445205" observedRunningTime="2025-12-06 09:10:34.191525655 +0000 UTC m=+8009.004885054" watchObservedRunningTime="2025-12-06 09:10:34.207481632 +0000 UTC m=+8009.020841021" Dec 06 09:10:43 crc kubenswrapper[4954]: I1206 09:10:43.248026 4954 generic.go:334] "Generic (PLEG): container finished" podID="e11e817c-9bd1-4276-b2ec-c7732d9950ff" containerID="1ab78f977dc494d39b2751753b2145088c2d4d22154d61cbb0583aff375902f1" exitCode=0 Dec 06 09:10:43 crc kubenswrapper[4954]: I1206 09:10:43.248083 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" event={"ID":"e11e817c-9bd1-4276-b2ec-c7732d9950ff","Type":"ContainerDied","Data":"1ab78f977dc494d39b2751753b2145088c2d4d22154d61cbb0583aff375902f1"} Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.260719 4954 generic.go:334] "Generic (PLEG): container finished" podID="05fb018b-abd2-4aec-8e4f-7705beb33bf6" containerID="0e7140bc8c53d01483ea44be712435f136e56c3ef04cc40efc9ac14fa1de7cec" exitCode=0 Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.260819 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" event={"ID":"05fb018b-abd2-4aec-8e4f-7705beb33bf6","Type":"ContainerDied","Data":"0e7140bc8c53d01483ea44be712435f136e56c3ef04cc40efc9ac14fa1de7cec"} Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.727490 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.854253 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrbvv\" (UniqueName: \"kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv\") pod \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.854372 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle\") pod \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.854519 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key\") pod \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.854541 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory\") pod \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\" (UID: \"e11e817c-9bd1-4276-b2ec-c7732d9950ff\") " Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.859819 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv" (OuterVolumeSpecName: "kube-api-access-rrbvv") pod "e11e817c-9bd1-4276-b2ec-c7732d9950ff" (UID: "e11e817c-9bd1-4276-b2ec-c7732d9950ff"). InnerVolumeSpecName "kube-api-access-rrbvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.860851 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "e11e817c-9bd1-4276-b2ec-c7732d9950ff" (UID: "e11e817c-9bd1-4276-b2ec-c7732d9950ff"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.885164 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory" (OuterVolumeSpecName: "inventory") pod "e11e817c-9bd1-4276-b2ec-c7732d9950ff" (UID: "e11e817c-9bd1-4276-b2ec-c7732d9950ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.892788 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e11e817c-9bd1-4276-b2ec-c7732d9950ff" (UID: "e11e817c-9bd1-4276-b2ec-c7732d9950ff"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.957190 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrbvv\" (UniqueName: \"kubernetes.io/projected/e11e817c-9bd1-4276-b2ec-c7732d9950ff-kube-api-access-rrbvv\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.957230 4954 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.957240 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:44 crc kubenswrapper[4954]: I1206 09:10:44.957248 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e11e817c-9bd1-4276-b2ec-c7732d9950ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.276095 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" event={"ID":"e11e817c-9bd1-4276-b2ec-c7732d9950ff","Type":"ContainerDied","Data":"e8c5105424d572bb5ff0b46552d89bdc460575a2467ed7ab698bb133ed560078"} Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.276162 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8c5105424d572bb5ff0b46552d89bdc460575a2467ed7ab698bb133ed560078" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.276288 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.734492 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.873967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory\") pod \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.874106 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key\") pod \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.874183 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kq95\" (UniqueName: \"kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95\") pod \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.874261 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle\") pod \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\" (UID: \"05fb018b-abd2-4aec-8e4f-7705beb33bf6\") " Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.878494 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "05fb018b-abd2-4aec-8e4f-7705beb33bf6" (UID: "05fb018b-abd2-4aec-8e4f-7705beb33bf6"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.878978 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95" (OuterVolumeSpecName: "kube-api-access-6kq95") pod "05fb018b-abd2-4aec-8e4f-7705beb33bf6" (UID: "05fb018b-abd2-4aec-8e4f-7705beb33bf6"). InnerVolumeSpecName "kube-api-access-6kq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.904549 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "05fb018b-abd2-4aec-8e4f-7705beb33bf6" (UID: "05fb018b-abd2-4aec-8e4f-7705beb33bf6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.905740 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory" (OuterVolumeSpecName: "inventory") pod "05fb018b-abd2-4aec-8e4f-7705beb33bf6" (UID: "05fb018b-abd2-4aec-8e4f-7705beb33bf6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.976586 4954 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.976622 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.976636 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05fb018b-abd2-4aec-8e4f-7705beb33bf6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:45 crc kubenswrapper[4954]: I1206 09:10:45.976678 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kq95\" (UniqueName: \"kubernetes.io/projected/05fb018b-abd2-4aec-8e4f-7705beb33bf6-kube-api-access-6kq95\") on node \"crc\" DevicePath \"\"" Dec 06 09:10:46 crc kubenswrapper[4954]: I1206 09:10:46.304317 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" event={"ID":"05fb018b-abd2-4aec-8e4f-7705beb33bf6","Type":"ContainerDied","Data":"98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231"} Dec 06 09:10:46 crc kubenswrapper[4954]: I1206 09:10:46.306549 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98385440ba3191e388a129f97f4442e5de695d3a904277c1fecca91fab56b231" Dec 06 09:10:46 crc kubenswrapper[4954]: I1206 09:10:46.304686 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.009705 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8"] Dec 06 09:10:51 crc kubenswrapper[4954]: E1206 09:10:51.010703 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fb018b-abd2-4aec-8e4f-7705beb33bf6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.010716 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fb018b-abd2-4aec-8e4f-7705beb33bf6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 06 09:10:51 crc kubenswrapper[4954]: E1206 09:10:51.010752 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11e817c-9bd1-4276-b2ec-c7732d9950ff" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.010759 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11e817c-9bd1-4276-b2ec-c7732d9950ff" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.010966 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11e817c-9bd1-4276-b2ec-c7732d9950ff" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.010989 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fb018b-abd2-4aec-8e4f-7705beb33bf6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.011813 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.021729 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.022754 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8"] Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.026542 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.026821 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.026843 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.036061 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846"] Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.037594 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.042960 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.044049 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.065353 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846"] Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.180757 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.180871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrgg\" (UniqueName: \"kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.180962 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.181006 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.181414 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whwsd\" (UniqueName: \"kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.181513 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 
09:10:51.181791 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.182111 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283616 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283786 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whwsd\" (UniqueName: \"kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283815 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283870 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283906 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 
06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.283969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.284028 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrgg\" (UniqueName: \"kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.290582 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.290660 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.292070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.292457 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.295444 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.298110 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 
09:10:51.300835 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whwsd\" (UniqueName: \"kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.303006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrgg\" (UniqueName: \"kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.346283 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.361441 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:10:51 crc kubenswrapper[4954]: I1206 09:10:51.949870 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8"] Dec 06 09:10:52 crc kubenswrapper[4954]: I1206 09:10:52.047005 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846"] Dec 06 09:10:52 crc kubenswrapper[4954]: W1206 09:10:52.049121 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310d8011_2558_4976_a3f3_0c28d9ea366f.slice/crio-09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636 WatchSource:0}: Error finding container 09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636: Status 404 returned error can't find the container with id 09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636 Dec 06 09:10:52 crc kubenswrapper[4954]: I1206 09:10:52.365451 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" event={"ID":"310d8011-2558-4976-a3f3-0c28d9ea366f","Type":"ContainerStarted","Data":"09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636"} Dec 06 09:10:52 crc kubenswrapper[4954]: I1206 09:10:52.368022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" event={"ID":"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8","Type":"ContainerStarted","Data":"37d5a891b9858e68c519049c9139149403371f1c3493e2c08ae63743c20c27fa"} Dec 06 09:10:53 crc kubenswrapper[4954]: I1206 09:10:53.381060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" event={"ID":"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8","Type":"ContainerStarted","Data":"d5dbdd02edaa06c1b8ae66a48a21f3d3b83b82297314b53959c3675d982d9fc3"} Dec 06 09:10:53 crc kubenswrapper[4954]: I1206 09:10:53.385140 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" 
event={"ID":"310d8011-2558-4976-a3f3-0c28d9ea366f","Type":"ContainerStarted","Data":"ca4925b80f9fad953c3414ba61c268d82ee7ddbc6533c49f69ed92592ac7ebcd"} Dec 06 09:10:53 crc kubenswrapper[4954]: I1206 09:10:53.408967 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" podStartSLOduration=2.987695598 podStartE2EDuration="3.408947593s" podCreationTimestamp="2025-12-06 09:10:50 +0000 UTC" firstStartedPulling="2025-12-06 09:10:51.952977699 +0000 UTC m=+8026.766337088" lastFinishedPulling="2025-12-06 09:10:52.374229654 +0000 UTC m=+8027.187589083" observedRunningTime="2025-12-06 09:10:53.403491337 +0000 UTC m=+8028.216850736" watchObservedRunningTime="2025-12-06 09:10:53.408947593 +0000 UTC m=+8028.222306982" Dec 06 09:10:53 crc kubenswrapper[4954]: I1206 09:10:53.426881 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" podStartSLOduration=3.003186613 podStartE2EDuration="3.426856203s" podCreationTimestamp="2025-12-06 09:10:50 +0000 UTC" firstStartedPulling="2025-12-06 09:10:52.050495171 +0000 UTC m=+8026.863854560" lastFinishedPulling="2025-12-06 09:10:52.474164761 +0000 UTC m=+8027.287524150" observedRunningTime="2025-12-06 09:10:53.419497926 +0000 UTC m=+8028.232857335" watchObservedRunningTime="2025-12-06 09:10:53.426856203 +0000 UTC m=+8028.240215592" Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.050419 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-h75rt"] Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.060088 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-h75rt"] Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.070119 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-856b-account-create-update-tftj9"] Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.078396 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-856b-account-create-update-tftj9"] Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.456179 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a375ec95-2316-4971-9b5e-757088bfaa3a" path="/var/lib/kubelet/pods/a375ec95-2316-4971-9b5e-757088bfaa3a/volumes" Dec 06 09:10:57 crc kubenswrapper[4954]: I1206 09:10:57.458006 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6670e88-e4e3-47e7-988c-55a8d6fc868f" path="/var/lib/kubelet/pods/a6670e88-e4e3-47e7-988c-55a8d6fc868f/volumes" Dec 06 09:11:07 crc kubenswrapper[4954]: I1206 09:11:07.033074 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xfwfr"] Dec 06 09:11:07 crc kubenswrapper[4954]: I1206 09:11:07.043944 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xfwfr"] Dec 06 09:11:07 crc kubenswrapper[4954]: I1206 09:11:07.454349 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9c4efc-2261-4ee6-8de1-ff2ee393fa77" path="/var/lib/kubelet/pods/9d9c4efc-2261-4ee6-8de1-ff2ee393fa77/volumes" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.086336 4954 scope.go:117] "RemoveContainer" containerID="a1be0a810e0b1182c5cdee9415d159dfb3145088f0e197f52835e596f783ae2f" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.122165 4954 scope.go:117] "RemoveContainer" 
containerID="6111d82da607d255c25b89312b5b098b5efc229dee3dcb54ddb20ff558778520" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.190365 4954 scope.go:117] "RemoveContainer" containerID="51ba5f6bebaee456a005a086ab9cbccc59287c82b90049fce52c6ac82e1708c5" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.211843 4954 scope.go:117] "RemoveContainer" containerID="50410ebca16e2200383c361a425278479d427a64e32e8c2c02ea0931bc60aa62" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.402929 4954 scope.go:117] "RemoveContainer" containerID="71d657c037805e8342ad47387d7eed5887aeef88e929425e622e33b9dd67c89b" Dec 06 09:11:27 crc kubenswrapper[4954]: I1206 09:11:27.438295 4954 scope.go:117] "RemoveContainer" containerID="baebdeff029d20859471674ca23a4036db0ec8cd584b13a62b8e1923fef26993" Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.055173 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-c5cvb"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.069696 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f156-account-create-update-vxs99"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.082129 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-x24tp"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.091676 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2srl5"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.104613 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f156-account-create-update-vxs99"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.129427 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-c5cvb"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.144798 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2srl5"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.158209 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-x24tp"] Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.462510 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0283d717-473b-41df-a35e-dc69e47c185f" path="/var/lib/kubelet/pods/0283d717-473b-41df-a35e-dc69e47c185f/volumes" Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.463897 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100c0102-825d-4f1c-9a32-511f7ef4081c" path="/var/lib/kubelet/pods/100c0102-825d-4f1c-9a32-511f7ef4081c/volumes" Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.465110 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b65e94-c4ec-45b8-b20f-9369a89280fd" path="/var/lib/kubelet/pods/91b65e94-c4ec-45b8-b20f-9369a89280fd/volumes" Dec 06 09:12:07 crc kubenswrapper[4954]: I1206 09:12:07.466405 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb30f1cd-3245-4e8e-a852-ef58699dff5f" path="/var/lib/kubelet/pods/bb30f1cd-3245-4e8e-a852-ef58699dff5f/volumes" Dec 06 09:12:08 crc kubenswrapper[4954]: I1206 09:12:08.039216 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-05ea-account-create-update-lrhlm"] Dec 06 09:12:08 crc kubenswrapper[4954]: I1206 09:12:08.056544 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4263-account-create-update-6l4p9"] Dec 06 09:12:08 crc kubenswrapper[4954]: I1206 09:12:08.070850 4954 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-05ea-account-create-update-lrhlm"] Dec 06 09:12:08 crc kubenswrapper[4954]: I1206 09:12:08.082659 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4263-account-create-update-6l4p9"] Dec 06 09:12:09 crc kubenswrapper[4954]: I1206 09:12:09.453530 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17477a55-543f-4561-bbae-101a1fba05d9" path="/var/lib/kubelet/pods/17477a55-543f-4561-bbae-101a1fba05d9/volumes" Dec 06 09:12:09 crc kubenswrapper[4954]: I1206 09:12:09.454646 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9836498d-4b30-4325-9cdc-0cfe5c3073fc" path="/var/lib/kubelet/pods/9836498d-4b30-4325-9cdc-0cfe5c3073fc/volumes" Dec 06 09:12:26 crc kubenswrapper[4954]: I1206 09:12:26.041068 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lqx47"] Dec 06 09:12:26 crc kubenswrapper[4954]: I1206 09:12:26.057127 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lqx47"] Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.455579 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2346b393-bbbb-4f4f-a056-e1697ea8428b" path="/var/lib/kubelet/pods/2346b393-bbbb-4f4f-a056-e1697ea8428b/volumes" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.573987 4954 scope.go:117] "RemoveContainer" containerID="a7b3ebef671a3fe78d828715d13e643b24ca3d4717dd6da4b2b4e1eb74c2da22" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.631510 4954 scope.go:117] "RemoveContainer" containerID="82fc39882aa341485e8cd47169ef8d68727bd6f788de4fb8609efe635d7aa517" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.658420 4954 scope.go:117] "RemoveContainer" containerID="db03219069090717bb2cb3d4922c829147b65a42f4dab9649baea8a1b8503d1f" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.712238 4954 scope.go:117] "RemoveContainer" containerID="add6d48afa13434d0f9e3087e5162265fd3fcb84c411bd80c1b6000c2ade33ad" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.763320 4954 scope.go:117] "RemoveContainer" containerID="93cccc4cd7a08fd67bebb0ab739f3db92d3d4902537f1b9b2e1d69e710f0778e" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.827116 4954 scope.go:117] "RemoveContainer" containerID="12c5ed624d5e2f89166eade9f0fed510b6a70994b42c161aea7651baa6f59420" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.874145 4954 scope.go:117] "RemoveContainer" containerID="45bd7f9435726a94de00386b4c7d07760546b063634963611255646eb6ab28ca" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.913062 4954 scope.go:117] "RemoveContainer" containerID="827d60d6e470d3fdbf0b8c088055c0c89edc93d7017da2deb4dd6ad0da54697b" Dec 06 09:12:27 crc kubenswrapper[4954]: I1206 09:12:27.932270 4954 scope.go:117] "RemoveContainer" containerID="6c5f824e827abaa732aed26fc27f9076dacdab6c2af1a424276b3032c33a485d" Dec 06 09:12:40 crc kubenswrapper[4954]: I1206 09:12:40.100902 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:12:40 crc kubenswrapper[4954]: I1206 09:12:40.101351 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.026087 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xj9l"] Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.041301 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-nrs7z"] Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.055405 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-nrs7z"] Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.071882 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6xj9l"] Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.455319 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c55851b-2db1-4dce-98e8-a7e3dcb41190" path="/var/lib/kubelet/pods/5c55851b-2db1-4dce-98e8-a7e3dcb41190/volumes" Dec 06 09:12:47 crc kubenswrapper[4954]: I1206 09:12:47.455945 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4ca0d9-e86a-46f5-941a-b692d62ab2e6" path="/var/lib/kubelet/pods/ca4ca0d9-e86a-46f5-941a-b692d62ab2e6/volumes" Dec 06 09:13:06 crc kubenswrapper[4954]: I1206 09:13:06.044929 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-nvnws"] Dec 06 09:13:06 crc kubenswrapper[4954]: I1206 09:13:06.054742 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nvnws"] Dec 06 09:13:07 crc kubenswrapper[4954]: I1206 09:13:07.456551 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44770c67-121a-4d30-8915-bd5083f9084b" path="/var/lib/kubelet/pods/44770c67-121a-4d30-8915-bd5083f9084b/volumes" Dec 06 09:13:10 crc kubenswrapper[4954]: I1206 09:13:10.100967 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:13:10 crc kubenswrapper[4954]: I1206 09:13:10.101456 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:13:28 crc kubenswrapper[4954]: I1206 09:13:28.093751 4954 scope.go:117] "RemoveContainer" containerID="ec5e22c416a70cb7e7195a71443f7142d49975c23ed8c4a0762e1db1c661a7dc" Dec 06 09:13:28 crc kubenswrapper[4954]: I1206 09:13:28.144756 4954 scope.go:117] "RemoveContainer" containerID="8bf150aa0019c3f84a3929dc1735d31e66ed2b18bac87cf3ef22153533367525" Dec 06 09:13:28 crc kubenswrapper[4954]: I1206 09:13:28.179222 4954 scope.go:117] "RemoveContainer" containerID="9635d8deddc78aea91f3844e877dd66ab05b48a040214ccd70d712cddc555dfe" Dec 06 09:13:40 crc kubenswrapper[4954]: I1206 09:13:40.100914 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 06 09:13:40 crc kubenswrapper[4954]: I1206 09:13:40.101456 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:13:40 crc kubenswrapper[4954]: I1206 09:13:40.101503 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:13:40 crc kubenswrapper[4954]: I1206 09:13:40.102339 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:13:40 crc kubenswrapper[4954]: I1206 09:13:40.102394 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b" gracePeriod=600 Dec 06 09:13:41 crc kubenswrapper[4954]: I1206 09:13:41.070447 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b" exitCode=0 Dec 06 09:13:41 crc kubenswrapper[4954]: I1206 09:13:41.070557 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b"} Dec 06 09:13:41 crc kubenswrapper[4954]: I1206 09:13:41.071069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"} Dec 06 09:13:41 crc kubenswrapper[4954]: I1206 09:13:41.071103 4954 scope.go:117] "RemoveContainer" containerID="b21a5fd2125434c672fedd6ee9d6eb284eb813a915134458e50d18a1611f0d78" Dec 06 09:14:28 crc kubenswrapper[4954]: I1206 09:14:28.334280 4954 scope.go:117] "RemoveContainer" containerID="1264ca1e79d59b1fb0e2f2c2bd1a30c290a0b63ef29a5000582f06bbfe7b37b5" Dec 06 09:14:28 crc kubenswrapper[4954]: I1206 09:14:28.365443 4954 scope.go:117] "RemoveContainer" containerID="24b722927c8d98c16e1d44cecef62b9c79b86adbf6db52c88efc05412802f800" Dec 06 09:14:28 crc kubenswrapper[4954]: I1206 09:14:28.408102 4954 scope.go:117] "RemoveContainer" containerID="6a104489772e875edc8d11eb4688bb36944a923915ef0c8e9ee52d11952916e9" Dec 06 09:14:28 crc kubenswrapper[4954]: I1206 09:14:28.468927 4954 scope.go:117] "RemoveContainer" containerID="36c95ce3c369ac0c77bd6cd14f813bbae7a0064622a257d93d3e8598acacea65" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.173048 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd"] Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.175230 4954 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.178152 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.178665 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.184451 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd"] Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.339487 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6m7\" (UniqueName: \"kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.339549 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.339655 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.442243 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6m7\" (UniqueName: \"kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.442327 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.442369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.443436 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.452438 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.458539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6m7\" (UniqueName: \"kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7\") pod \"collect-profiles-29416875-p2cwd\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.504289 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:00 crc kubenswrapper[4954]: I1206 09:15:00.958647 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd"] Dec 06 09:15:01 crc kubenswrapper[4954]: I1206 09:15:01.021788 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" event={"ID":"5e64b559-f2f9-42d5-9c45-bbdab57efb99","Type":"ContainerStarted","Data":"d967a5fbdbd505799fe6742ed86736f149dc313c29a88431b3f888fac8f5d08b"} Dec 06 09:15:02 crc kubenswrapper[4954]: I1206 09:15:02.034083 4954 generic.go:334] "Generic (PLEG): container finished" podID="5e64b559-f2f9-42d5-9c45-bbdab57efb99" containerID="a63ac1296a8ae2835867f922d1c0a55b83dffbea4307f1a1c5e243b18dd445a9" exitCode=0 Dec 06 09:15:02 crc kubenswrapper[4954]: I1206 09:15:02.034131 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" event={"ID":"5e64b559-f2f9-42d5-9c45-bbdab57efb99","Type":"ContainerDied","Data":"a63ac1296a8ae2835867f922d1c0a55b83dffbea4307f1a1c5e243b18dd445a9"} Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.422368 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.509420 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume\") pod \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.509574 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume\") pod \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.509808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft6m7\" (UniqueName: \"kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7\") pod \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\" (UID: \"5e64b559-f2f9-42d5-9c45-bbdab57efb99\") " Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.510219 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e64b559-f2f9-42d5-9c45-bbdab57efb99" (UID: "5e64b559-f2f9-42d5-9c45-bbdab57efb99"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.515437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e64b559-f2f9-42d5-9c45-bbdab57efb99" (UID: "5e64b559-f2f9-42d5-9c45-bbdab57efb99"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.523497 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7" (OuterVolumeSpecName: "kube-api-access-ft6m7") pod "5e64b559-f2f9-42d5-9c45-bbdab57efb99" (UID: "5e64b559-f2f9-42d5-9c45-bbdab57efb99"). InnerVolumeSpecName "kube-api-access-ft6m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.613297 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft6m7\" (UniqueName: \"kubernetes.io/projected/5e64b559-f2f9-42d5-9c45-bbdab57efb99-kube-api-access-ft6m7\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.613334 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e64b559-f2f9-42d5-9c45-bbdab57efb99-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:03 crc kubenswrapper[4954]: I1206 09:15:03.613345 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e64b559-f2f9-42d5-9c45-bbdab57efb99-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:04 crc kubenswrapper[4954]: I1206 09:15:04.056026 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" event={"ID":"5e64b559-f2f9-42d5-9c45-bbdab57efb99","Type":"ContainerDied","Data":"d967a5fbdbd505799fe6742ed86736f149dc313c29a88431b3f888fac8f5d08b"} Dec 06 09:15:04 crc kubenswrapper[4954]: I1206 09:15:04.056112 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d967a5fbdbd505799fe6742ed86736f149dc313c29a88431b3f888fac8f5d08b" Dec 06 09:15:04 crc kubenswrapper[4954]: I1206 09:15:04.056140 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd" Dec 06 09:15:04 crc kubenswrapper[4954]: E1206 09:15:04.200839 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e64b559_f2f9_42d5_9c45_bbdab57efb99.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e64b559_f2f9_42d5_9c45_bbdab57efb99.slice/crio-d967a5fbdbd505799fe6742ed86736f149dc313c29a88431b3f888fac8f5d08b\": RecentStats: unable to find data in memory cache]" Dec 06 09:15:04 crc kubenswrapper[4954]: I1206 09:15:04.503227 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"] Dec 06 09:15:04 crc kubenswrapper[4954]: I1206 09:15:04.515310 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416830-446qp"] Dec 06 09:15:05 crc kubenswrapper[4954]: I1206 09:15:05.458626 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="539b8ab6-bc54-459e-b9cf-a46c09e4449e" path="/var/lib/kubelet/pods/539b8ab6-bc54-459e-b9cf-a46c09e4449e/volumes" Dec 06 09:15:07 crc kubenswrapper[4954]: I1206 09:15:07.084336 4954 generic.go:334] "Generic (PLEG): container finished" podID="310d8011-2558-4976-a3f3-0c28d9ea366f" containerID="ca4925b80f9fad953c3414ba61c268d82ee7ddbc6533c49f69ed92592ac7ebcd" exitCode=0 Dec 06 09:15:07 crc kubenswrapper[4954]: I1206 09:15:07.084413 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" event={"ID":"310d8011-2558-4976-a3f3-0c28d9ea366f","Type":"ContainerDied","Data":"ca4925b80f9fad953c3414ba61c268d82ee7ddbc6533c49f69ed92592ac7ebcd"} Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.526510 4954 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.568701 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whwsd\" (UniqueName: \"kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd\") pod \"310d8011-2558-4976-a3f3-0c28d9ea366f\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.569067 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle\") pod \"310d8011-2558-4976-a3f3-0c28d9ea366f\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.569177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory\") pod \"310d8011-2558-4976-a3f3-0c28d9ea366f\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.569228 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key\") pod \"310d8011-2558-4976-a3f3-0c28d9ea366f\" (UID: \"310d8011-2558-4976-a3f3-0c28d9ea366f\") " Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.576070 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "310d8011-2558-4976-a3f3-0c28d9ea366f" (UID: "310d8011-2558-4976-a3f3-0c28d9ea366f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.579749 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd" (OuterVolumeSpecName: "kube-api-access-whwsd") pod "310d8011-2558-4976-a3f3-0c28d9ea366f" (UID: "310d8011-2558-4976-a3f3-0c28d9ea366f"). InnerVolumeSpecName "kube-api-access-whwsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.601574 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "310d8011-2558-4976-a3f3-0c28d9ea366f" (UID: "310d8011-2558-4976-a3f3-0c28d9ea366f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.603848 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory" (OuterVolumeSpecName: "inventory") pod "310d8011-2558-4976-a3f3-0c28d9ea366f" (UID: "310d8011-2558-4976-a3f3-0c28d9ea366f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.671697 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whwsd\" (UniqueName: \"kubernetes.io/projected/310d8011-2558-4976-a3f3-0c28d9ea366f-kube-api-access-whwsd\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.671734 4954 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.671748 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:08 crc kubenswrapper[4954]: I1206 09:15:08.671760 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310d8011-2558-4976-a3f3-0c28d9ea366f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:15:09 crc kubenswrapper[4954]: I1206 09:15:09.105156 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" event={"ID":"310d8011-2558-4976-a3f3-0c28d9ea366f","Type":"ContainerDied","Data":"09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636"} Dec 06 09:15:09 crc kubenswrapper[4954]: I1206 09:15:09.105212 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09a5b2b3d0efc2d80d28cc3fe3e94e9b15867730039ad2092bbafa55640fa636" Dec 06 09:15:09 crc kubenswrapper[4954]: I1206 09:15:09.105275 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846" Dec 06 09:15:28 crc kubenswrapper[4954]: I1206 09:15:28.531096 4954 scope.go:117] "RemoveContainer" containerID="a8e62d0fd2f4a7313c07c695aefff39769b1e7af592373966bb295b47168d783" Dec 06 09:15:40 crc kubenswrapper[4954]: I1206 09:15:40.101046 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:15:40 crc kubenswrapper[4954]: I1206 09:15:40.101846 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:15:48 crc kubenswrapper[4954]: I1206 09:15:48.040424 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-dlq25"] Dec 06 09:15:48 crc kubenswrapper[4954]: I1206 09:15:48.050629 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-ade3-account-create-update-s4f7m"] Dec 06 09:15:48 crc kubenswrapper[4954]: I1206 09:15:48.061915 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-dlq25"] Dec 06 09:15:48 crc kubenswrapper[4954]: I1206 09:15:48.069782 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-ade3-account-create-update-s4f7m"] Dec 06 09:15:49 crc kubenswrapper[4954]: I1206 09:15:49.472178 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad2ae4b-d1b0-4f50-9b00-215df9ed6069" path="/var/lib/kubelet/pods/1ad2ae4b-d1b0-4f50-9b00-215df9ed6069/volumes" Dec 06 09:15:49 crc kubenswrapper[4954]: I1206 09:15:49.475951 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43ee893-bc98-4376-9581-e44fe5e18298" path="/var/lib/kubelet/pods/d43ee893-bc98-4376-9581-e44fe5e18298/volumes" Dec 06 09:16:02 crc kubenswrapper[4954]: I1206 09:16:02.033170 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-w7d87"] Dec 06 09:16:02 crc kubenswrapper[4954]: I1206 09:16:02.044966 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-w7d87"] Dec 06 09:16:03 crc kubenswrapper[4954]: I1206 09:16:03.456887 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91de0dbc-f8be-4cf7-89fa-6cc87a075b0e" path="/var/lib/kubelet/pods/91de0dbc-f8be-4cf7-89fa-6cc87a075b0e/volumes" Dec 06 09:16:10 crc kubenswrapper[4954]: I1206 09:16:10.102006 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:16:10 crc kubenswrapper[4954]: I1206 09:16:10.102800 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:16:28 crc kubenswrapper[4954]: I1206 09:16:28.621005 
4954 scope.go:117] "RemoveContainer" containerID="266c92b1584be38d42e3adeb64aa03413b15bf3ad068a7df81ab5d73661b99c0" Dec 06 09:16:28 crc kubenswrapper[4954]: I1206 09:16:28.657666 4954 scope.go:117] "RemoveContainer" containerID="002c8462f127e7c759f1f27db9d32832385db0f3a9351ad070a87405de6a551b" Dec 06 09:16:28 crc kubenswrapper[4954]: I1206 09:16:28.704357 4954 scope.go:117] "RemoveContainer" containerID="93b0346a6bb6032e22ea6b756090cdd93b8c9aa7c3382b69360beb506054efa2" Dec 06 09:16:28 crc kubenswrapper[4954]: I1206 09:16:28.750607 4954 scope.go:117] "RemoveContainer" containerID="28a9659ded76ae9307a22b44789f377783e998f20b4d4cc4d371aaaed066808e" Dec 06 09:16:28 crc kubenswrapper[4954]: I1206 09:16:28.771987 4954 scope.go:117] "RemoveContainer" containerID="c5269119b39b0da6fa77b6a1eeddbe5f9db8eb058c73553be6e40ed493806f4e" Dec 06 09:16:40 crc kubenswrapper[4954]: I1206 09:16:40.101752 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:16:40 crc kubenswrapper[4954]: I1206 09:16:40.103743 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:16:40 crc kubenswrapper[4954]: I1206 09:16:40.103818 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:16:40 crc kubenswrapper[4954]: I1206 09:16:40.104731 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:16:40 crc kubenswrapper[4954]: I1206 09:16:40.104808 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" gracePeriod=600 Dec 06 09:16:40 crc kubenswrapper[4954]: E1206 09:16:40.784043 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:16:41 crc kubenswrapper[4954]: I1206 09:16:41.034291 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" exitCode=0 Dec 06 09:16:41 crc kubenswrapper[4954]: I1206 09:16:41.034339 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"} Dec 06 09:16:41 crc kubenswrapper[4954]: I1206 09:16:41.034376 4954 scope.go:117] "RemoveContainer" containerID="2bf4b8e7101728e942e6ac1740eae0d4c808fde87f7933834d9965f08347570b" Dec 06 09:16:41 crc kubenswrapper[4954]: I1206 09:16:41.035240 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:16:41 crc kubenswrapper[4954]: E1206 09:16:41.035520 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:16:56 crc kubenswrapper[4954]: I1206 09:16:56.443274 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:16:56 crc kubenswrapper[4954]: E1206 09:16:56.444041 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.773275 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:04 crc kubenswrapper[4954]: E1206 09:17:04.774088 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e64b559-f2f9-42d5-9c45-bbdab57efb99" containerName="collect-profiles" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.774100 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e64b559-f2f9-42d5-9c45-bbdab57efb99" containerName="collect-profiles" Dec 06 09:17:04 crc kubenswrapper[4954]: E1206 09:17:04.774139 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310d8011-2558-4976-a3f3-0c28d9ea366f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.774146 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="310d8011-2558-4976-a3f3-0c28d9ea366f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.774308 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="310d8011-2558-4976-a3f3-0c28d9ea366f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.774325 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e64b559-f2f9-42d5-9c45-bbdab57efb99" containerName="collect-profiles" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.775832 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.798883 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.913183 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7cv\" (UniqueName: \"kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.913258 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:04 crc kubenswrapper[4954]: I1206 09:17:04.913349 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.015140 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.015272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.015368 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7cv\" (UniqueName: \"kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.015680 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.015728 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.043955 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xf7cv\" (UniqueName: \"kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv\") pod \"redhat-operators-nhq2c\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.131766 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:05 crc kubenswrapper[4954]: I1206 09:17:05.677002 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:06 crc kubenswrapper[4954]: I1206 09:17:06.280826 4954 generic.go:334] "Generic (PLEG): container finished" podID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerID="36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa" exitCode=0 Dec 06 09:17:06 crc kubenswrapper[4954]: I1206 09:17:06.280896 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerDied","Data":"36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa"} Dec 06 09:17:06 crc kubenswrapper[4954]: I1206 09:17:06.281111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerStarted","Data":"108b81b194accfe3fcb64482f3f2e9b0334d25aab218421d5ab564b70d3398a2"} Dec 06 09:17:06 crc kubenswrapper[4954]: I1206 09:17:06.282635 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:17:07 crc kubenswrapper[4954]: I1206 09:17:07.294040 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerStarted","Data":"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4"} Dec 06 09:17:10 crc kubenswrapper[4954]: I1206 09:17:10.444261 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:17:10 crc kubenswrapper[4954]: E1206 09:17:10.445110 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:17:11 crc kubenswrapper[4954]: I1206 09:17:11.332647 4954 generic.go:334] "Generic (PLEG): container finished" podID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerID="77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4" exitCode=0 Dec 06 09:17:11 crc kubenswrapper[4954]: I1206 09:17:11.332715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerDied","Data":"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4"} Dec 06 09:17:12 crc kubenswrapper[4954]: I1206 09:17:12.347087 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" 
event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerStarted","Data":"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a"} Dec 06 09:17:15 crc kubenswrapper[4954]: I1206 09:17:15.132448 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:15 crc kubenswrapper[4954]: I1206 09:17:15.132815 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:16 crc kubenswrapper[4954]: I1206 09:17:16.181374 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhq2c" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="registry-server" probeResult="failure" output=< Dec 06 09:17:16 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 09:17:16 crc kubenswrapper[4954]: > Dec 06 09:17:24 crc kubenswrapper[4954]: I1206 09:17:24.443299 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:17:24 crc kubenswrapper[4954]: E1206 09:17:24.443985 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:17:25 crc kubenswrapper[4954]: I1206 09:17:25.182942 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:25 crc kubenswrapper[4954]: I1206 09:17:25.208240 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhq2c" podStartSLOduration=15.620858112 podStartE2EDuration="21.20818436s" podCreationTimestamp="2025-12-06 09:17:04 +0000 UTC" firstStartedPulling="2025-12-06 09:17:06.282333525 +0000 UTC m=+8401.095692914" lastFinishedPulling="2025-12-06 09:17:11.869659733 +0000 UTC m=+8406.683019162" observedRunningTime="2025-12-06 09:17:12.38395814 +0000 UTC m=+8407.197317529" watchObservedRunningTime="2025-12-06 09:17:25.20818436 +0000 UTC m=+8420.021543759" Dec 06 09:17:25 crc kubenswrapper[4954]: I1206 09:17:25.231311 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:25 crc kubenswrapper[4954]: I1206 09:17:25.419933 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:26 crc kubenswrapper[4954]: I1206 09:17:26.480171 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nhq2c" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="registry-server" containerID="cri-o://2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a" gracePeriod=2 Dec 06 09:17:26 crc kubenswrapper[4954]: I1206 09:17:26.991343 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.076677 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities\") pod \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.076750 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content\") pod \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.076822 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7cv\" (UniqueName: \"kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv\") pod \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\" (UID: \"1cd8598b-7b06-484f-bdb7-1d9901eb1db5\") " Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.077970 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities" (OuterVolumeSpecName: "utilities") pod "1cd8598b-7b06-484f-bdb7-1d9901eb1db5" (UID: "1cd8598b-7b06-484f-bdb7-1d9901eb1db5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.083764 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv" (OuterVolumeSpecName: "kube-api-access-xf7cv") pod "1cd8598b-7b06-484f-bdb7-1d9901eb1db5" (UID: "1cd8598b-7b06-484f-bdb7-1d9901eb1db5"). InnerVolumeSpecName "kube-api-access-xf7cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.178141 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7cv\" (UniqueName: \"kubernetes.io/projected/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-kube-api-access-xf7cv\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.178183 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.190192 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd8598b-7b06-484f-bdb7-1d9901eb1db5" (UID: "1cd8598b-7b06-484f-bdb7-1d9901eb1db5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.280239 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd8598b-7b06-484f-bdb7-1d9901eb1db5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.492718 4954 generic.go:334] "Generic (PLEG): container finished" podID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerID="2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a" exitCode=0 Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.492796 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhq2c" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.492825 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerDied","Data":"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a"} Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.492960 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhq2c" event={"ID":"1cd8598b-7b06-484f-bdb7-1d9901eb1db5","Type":"ContainerDied","Data":"108b81b194accfe3fcb64482f3f2e9b0334d25aab218421d5ab564b70d3398a2"} Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.492999 4954 scope.go:117] "RemoveContainer" containerID="2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.520217 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.524258 4954 scope.go:117] "RemoveContainer" containerID="77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.529311 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nhq2c"] Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.565240 4954 scope.go:117] "RemoveContainer" containerID="36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.614064 4954 scope.go:117] "RemoveContainer" containerID="2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a" Dec 06 09:17:27 crc kubenswrapper[4954]: E1206 09:17:27.614669 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a\": container with ID starting with 2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a not found: ID does not exist" containerID="2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.614711 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a"} err="failed to get container status \"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a\": rpc error: code = NotFound desc = could not find container \"2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a\": container with ID starting with 2635049bf8d8606226d985e4b0c4a54e2e985d1eb34143e238b0d3626a91611a not found: ID does not exist" Dec 06 09:17:27 crc 
kubenswrapper[4954]: I1206 09:17:27.614738 4954 scope.go:117] "RemoveContainer" containerID="77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4" Dec 06 09:17:27 crc kubenswrapper[4954]: E1206 09:17:27.614970 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4\": container with ID starting with 77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4 not found: ID does not exist" containerID="77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.614997 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4"} err="failed to get container status \"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4\": rpc error: code = NotFound desc = could not find container \"77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4\": container with ID starting with 77f97f597073dde8042fbf671930f53e30d5873014016d3cb650f980dc4d58d4 not found: ID does not exist" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.615015 4954 scope.go:117] "RemoveContainer" containerID="36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa" Dec 06 09:17:27 crc kubenswrapper[4954]: E1206 09:17:27.615374 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa\": container with ID starting with 36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa not found: ID does not exist" containerID="36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa" Dec 06 09:17:27 crc kubenswrapper[4954]: I1206 09:17:27.615404 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa"} err="failed to get container status \"36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa\": rpc error: code = NotFound desc = could not find container \"36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa\": container with ID starting with 36602f407fec2ac7ef3dfa50d778e0823ffdcf8dd1aa6fd07a4c8adf6e2078fa not found: ID does not exist" Dec 06 09:17:29 crc kubenswrapper[4954]: I1206 09:17:29.457164 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" path="/var/lib/kubelet/pods/1cd8598b-7b06-484f-bdb7-1d9901eb1db5/volumes" Dec 06 09:17:35 crc kubenswrapper[4954]: I1206 09:17:35.450879 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:17:35 crc kubenswrapper[4954]: E1206 09:17:35.451503 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:17:46 crc kubenswrapper[4954]: I1206 09:17:46.443754 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" 
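The "RemoveContainer" / "Error syncing pod, skipping ... CrashLoopBackOff" pairs above and below recur on each pod-worker sync while the restart gate is still closed; the gate itself is kubelet's per-container crash backoff, whose cap appears verbatim in these messages as "back-off 5m0s". A minimal sketch of that schedule, assuming upstream kubelet defaults (an initial 10s delay doubling per crash, capped at MaxContainerBackOff = 5m): only the 5m cap is corroborated by this log, the 10s base and doubling are assumptions taken from upstream kubelet source.

INITIAL_BACKOFF_S = 10   # assumed kubelet default (backOffPeriod); not visible in this log
MAX_BACKOFF_S = 300      # MaxContainerBackOff; matches the "back-off 5m0s" messages here

def restart_delays(crashes: int) -> list[int]:
    # Delay kubelet waits before each successive restart of a crash-looping
    # container: double the previous delay, capped at MAX_BACKOFF_S.
    delays, delay = [], INITIAL_BACKOFF_S
    for _ in range(crashes):
        delays.append(delay)
        delay = min(delay * 2, MAX_BACKOFF_S)
    return delays

print(restart_delays(8))  # -> [10, 20, 40, 80, 160, 300, 300, 300]

Until the current delay elapses, each sync attempt is rejected and logged as one of the "Error syncing pod, skipping" lines seen in this stretch, which is why those lines repeat every ten-odd seconds (09:17:46, 09:17:59, 09:18:13, ...) even though the effective restart interval for machine-config-daemon is five minutes.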
Dec 06 09:17:46 crc kubenswrapper[4954]: E1206 09:17:46.444445 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:17:59 crc kubenswrapper[4954]: I1206 09:17:59.444721 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:17:59 crc kubenswrapper[4954]: E1206 09:17:59.446140 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:18:13 crc kubenswrapper[4954]: I1206 09:18:13.444064 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:18:13 crc kubenswrapper[4954]: E1206 09:18:13.444889 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:18:25 crc kubenswrapper[4954]: I1206 09:18:25.450033 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:18:25 crc kubenswrapper[4954]: E1206 09:18:25.450946 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:18:32 crc kubenswrapper[4954]: I1206 09:18:32.057375 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-c9hvq"] Dec 06 09:18:32 crc kubenswrapper[4954]: I1206 09:18:32.065823 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0197-account-create-update-rmv95"] Dec 06 09:18:32 crc kubenswrapper[4954]: I1206 09:18:32.075656 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-c9hvq"] Dec 06 09:18:32 crc kubenswrapper[4954]: I1206 09:18:32.083188 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0197-account-create-update-rmv95"] Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.466340 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e8c0ec-a401-4e33-b035-cc6cabed461a" path="/var/lib/kubelet/pods/91e8c0ec-a401-4e33-b035-cc6cabed461a/volumes" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.467059 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eb202835-f662-4f75-a820-76ffc6def068" path="/var/lib/kubelet/pods/eb202835-f662-4f75-a820-76ffc6def068/volumes" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.552788 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"] Dec 06 09:18:33 crc kubenswrapper[4954]: E1206 09:18:33.553222 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="registry-server" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.553244 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="registry-server" Dec 06 09:18:33 crc kubenswrapper[4954]: E1206 09:18:33.553273 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="extract-content" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.553279 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="extract-content" Dec 06 09:18:33 crc kubenswrapper[4954]: E1206 09:18:33.553302 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="extract-utilities" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.553310 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="extract-utilities" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.553546 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd8598b-7b06-484f-bdb7-1d9901eb1db5" containerName="registry-server" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.555724 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.567063 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"] Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.755737 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.756109 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.756149 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzv2h\" (UniqueName: \"kubernetes.io/projected/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-kube-api-access-zzv2h\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.857850 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.857925 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzv2h\" (UniqueName: \"kubernetes.io/projected/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-kube-api-access-zzv2h\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.858109 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.858406 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.858501 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content\") pod \"redhat-marketplace-64h25\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.875877 4954 operation_generator.go:637] "MountVolume.SetUp 
Dec 06 09:18:33 crc kubenswrapper[4954]: I1206 09:18:33.878794 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64h25"
Dec 06 09:18:34 crc kubenswrapper[4954]: I1206 09:18:34.408447 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"]
Dec 06 09:18:34 crc kubenswrapper[4954]: I1206 09:18:34.655574 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerID="313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65" exitCode=0
Dec 06 09:18:34 crc kubenswrapper[4954]: I1206 09:18:34.655680 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerDied","Data":"313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65"}
Dec 06 09:18:34 crc kubenswrapper[4954]: I1206 09:18:34.655919 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerStarted","Data":"0cce472ad12be12f3fee4cfea9dd71fab74b8c8a829930dda853b46d60d24121"}
Dec 06 09:18:36 crc kubenswrapper[4954]: I1206 09:18:36.673383 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerID="63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9" exitCode=0
Dec 06 09:18:36 crc kubenswrapper[4954]: I1206 09:18:36.673474 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerDied","Data":"63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9"}
Dec 06 09:18:37 crc kubenswrapper[4954]: I1206 09:18:37.731884 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerStarted","Data":"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f"}
Dec 06 09:18:37 crc kubenswrapper[4954]: I1206 09:18:37.760043 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64h25" podStartSLOduration=2.333896613 podStartE2EDuration="4.760006406s" podCreationTimestamp="2025-12-06 09:18:33 +0000 UTC" firstStartedPulling="2025-12-06 09:18:34.657192374 +0000 UTC m=+8489.470551763" lastFinishedPulling="2025-12-06 09:18:37.083302167 +0000 UTC m=+8491.896661556" observedRunningTime="2025-12-06 09:18:37.755377091 +0000 UTC m=+8492.568736480" watchObservedRunningTime="2025-12-06 09:18:37.760006406 +0000 UTC m=+8492.573365795"
Dec 06 09:18:38 crc kubenswrapper[4954]: I1206 09:18:38.443247 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:18:38 crc kubenswrapper[4954]: E1206 09:18:38.443815 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
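[Editor's note] The "Observed pod startup duration" record above is internally consistent: podStartE2EDuration (4.760006406s) is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.333896613) is that E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal Go check of the arithmetic, using the timestamps copied from the record:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches how the record prints times (Go's default time.Time format).
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-06 09:18:33 +0000 UTC")
	firstPull := parse("2025-12-06 09:18:34.657192374 +0000 UTC")
	lastPull := parse("2025-12-06 09:18:37.083302167 +0000 UTC")
	running := parse("2025-12-06 09:18:37.760006406 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: E2E minus image-pull time
	fmt.Println(e2e, slo)                // 4.760006406s 2.333896613s
}
```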
Dec 06 09:18:39 crc kubenswrapper[4954]: I1206 09:18:39.754753 4954 generic.go:334] "Generic (PLEG): container finished" podID="cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" containerID="d5dbdd02edaa06c1b8ae66a48a21f3d3b83b82297314b53959c3675d982d9fc3" exitCode=0
Dec 06 09:18:39 crc kubenswrapper[4954]: I1206 09:18:39.755658 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" event={"ID":"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8","Type":"ContainerDied","Data":"d5dbdd02edaa06c1b8ae66a48a21f3d3b83b82297314b53959c3675d982d9fc3"}
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.224351 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8"
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.416768 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwrgg\" (UniqueName: \"kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg\") pod \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") "
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.416981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key\") pod \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") "
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.417070 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory\") pod \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") "
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.417097 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle\") pod \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\" (UID: \"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8\") "
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.422281 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" (UID: "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.422328 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg" (OuterVolumeSpecName: "kube-api-access-rwrgg") pod "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" (UID: "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8"). InnerVolumeSpecName "kube-api-access-rwrgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
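[Editor's note] The CrashLoopBackOff error a few records up recurs through the rest of this log roughly every 11-15 seconds as the sync loop re-evaluates machine-config-daemon-f5lgw, and each time the wait is already pinned at the "back-off 5m0s" cap named in the message. A sketch of the doubling-with-cap schedule, assuming the kubelet defaults of a 10s initial delay and a 5m cap (treat those constants as assumptions):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Container restart back-off: start small, double after each failed
	// restart, never exceed the cap. After ~6 failures the pod sits at 5m0s,
	// which is what the repeated "back-off 5m0s" errors in this log show.
	backoff := 10 * time.Second
	const cap = 5 * time.Minute
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("restart %d: wait %v\n", attempt, backoff)
		backoff *= 2
		if backoff > cap {
			backoff = cap
		}
	}
}
```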
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.445758 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" (UID: "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.456488 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory" (OuterVolumeSpecName: "inventory") pod "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" (UID: "cbcd2e0b-c24c-4c7e-bad8-82b47335ace8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.519735 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.519788 4954 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.519822 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwrgg\" (UniqueName: \"kubernetes.io/projected/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-kube-api-access-rwrgg\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.519836 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbcd2e0b-c24c-4c7e-bad8-82b47335ace8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.773161 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" event={"ID":"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8","Type":"ContainerDied","Data":"37d5a891b9858e68c519049c9139149403371f1c3493e2c08ae63743c20c27fa"} Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.773201 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37d5a891b9858e68c519049c9139149403371f1c3493e2c08ae63743c20c27fa" Dec 06 09:18:41 crc kubenswrapper[4954]: I1206 09:18:41.773226 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8" Dec 06 09:18:43 crc kubenswrapper[4954]: I1206 09:18:43.879581 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:43 crc kubenswrapper[4954]: I1206 09:18:43.879632 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:43 crc kubenswrapper[4954]: I1206 09:18:43.936188 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:44 crc kubenswrapper[4954]: I1206 09:18:44.037143 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-zgc76"] Dec 06 09:18:44 crc kubenswrapper[4954]: I1206 09:18:44.049280 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-zgc76"] Dec 06 09:18:44 crc kubenswrapper[4954]: I1206 09:18:44.860314 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:44 crc kubenswrapper[4954]: I1206 09:18:44.911193 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"] Dec 06 09:18:45 crc kubenswrapper[4954]: I1206 09:18:45.456353 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73f11fa-ce19-4f8e-9029-6bee3482fc8f" path="/var/lib/kubelet/pods/c73f11fa-ce19-4f8e-9029-6bee3482fc8f/volumes" Dec 06 09:18:46 crc kubenswrapper[4954]: I1206 09:18:46.830139 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64h25" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="registry-server" containerID="cri-o://da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f" gracePeriod=2 Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.302775 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.434486 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities\") pod \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.434637 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content\") pod \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.434687 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzv2h\" (UniqueName: \"kubernetes.io/projected/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-kube-api-access-zzv2h\") pod \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\" (UID: \"0d2b73ae-2be2-4eab-b6ac-56b00a613beb\") " Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.435394 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities" (OuterVolumeSpecName: "utilities") pod "0d2b73ae-2be2-4eab-b6ac-56b00a613beb" (UID: "0d2b73ae-2be2-4eab-b6ac-56b00a613beb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.440269 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-kube-api-access-zzv2h" (OuterVolumeSpecName: "kube-api-access-zzv2h") pod "0d2b73ae-2be2-4eab-b6ac-56b00a613beb" (UID: "0d2b73ae-2be2-4eab-b6ac-56b00a613beb"). InnerVolumeSpecName "kube-api-access-zzv2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.452791 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d2b73ae-2be2-4eab-b6ac-56b00a613beb" (UID: "0d2b73ae-2be2-4eab-b6ac-56b00a613beb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.536534 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.536598 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.536613 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzv2h\" (UniqueName: \"kubernetes.io/projected/0d2b73ae-2be2-4eab-b6ac-56b00a613beb-kube-api-access-zzv2h\") on node \"crc\" DevicePath \"\"" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.844700 4954 generic.go:334] "Generic (PLEG): container finished" podID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerID="da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f" exitCode=0 Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.844751 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64h25" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.844752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerDied","Data":"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f"} Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.844899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64h25" event={"ID":"0d2b73ae-2be2-4eab-b6ac-56b00a613beb","Type":"ContainerDied","Data":"0cce472ad12be12f3fee4cfea9dd71fab74b8c8a829930dda853b46d60d24121"} Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.844934 4954 scope.go:117] "RemoveContainer" containerID="da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.879458 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"] Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.890480 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64h25"] Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.891252 4954 scope.go:117] "RemoveContainer" containerID="63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.925226 4954 scope.go:117] "RemoveContainer" containerID="313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.969307 4954 scope.go:117] "RemoveContainer" containerID="da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f" Dec 06 09:18:47 crc kubenswrapper[4954]: E1206 09:18:47.970755 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f\": container with ID starting with da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f not found: ID does not exist" containerID="da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.970801 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f"} err="failed to get container status \"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f\": rpc error: code = NotFound desc = could not find container \"da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f\": container with ID starting with da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f not found: ID does not exist" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.970909 4954 scope.go:117] "RemoveContainer" containerID="63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9" Dec 06 09:18:47 crc kubenswrapper[4954]: E1206 09:18:47.971770 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9\": container with ID starting with 63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9 not found: ID does not exist" containerID="63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.971809 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9"} err="failed to get container status \"63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9\": rpc error: code = NotFound desc = could not find container \"63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9\": container with ID starting with 63fc3ed852118ff5a190316967f899928075ee19ad2c197d425b39eb5cd64af9 not found: ID does not exist" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.971828 4954 scope.go:117] "RemoveContainer" containerID="313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65" Dec 06 09:18:47 crc kubenswrapper[4954]: E1206 09:18:47.972185 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65\": container with ID starting with 313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65 not found: ID does not exist" containerID="313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65" Dec 06 09:18:47 crc kubenswrapper[4954]: I1206 09:18:47.972245 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65"} err="failed to get container status \"313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65\": rpc error: code = NotFound desc = could not find container \"313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65\": container with ID starting with 313eafea2a5718c2a0a6c2939d0c08b4857c9b4c5d24f86fb3815516eb064c65 not found: ID does not exist" Dec 06 09:18:49 crc kubenswrapper[4954]: I1206 09:18:49.580436 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:18:49 crc kubenswrapper[4954]: E1206 09:18:49.580723 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
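[Editor's note] The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triples above are benign: by the time garbage collection re-queried the runtime, cri-o had already removed the containers, so a NotFound answer is a safe terminal state and the delete is effectively idempotent. A hedged Go sketch of that pattern; errNotFound, containerStatus and removeContainer are illustrative stand-ins, not CRI client code:

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI "NotFound" gRPC status seen in the log.
var errNotFound = errors.New("NotFound: ID does not exist")

// containerStatus pretends to ask the runtime; here the container is gone.
func containerStatus(id string) error { return errNotFound }

// removeContainer treats "already gone" as success rather than a failure.
func removeContainer(id string) error {
	if err := containerStatus(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("container %s already gone; treating delete as done\n", id[:8])
			return nil // idempotent: nothing left to delete
		}
		return err
	}
	// ... otherwise issue the actual delete call here ...
	return nil
}

func main() {
	if err := removeContainer("da1d5ddcf5854a848d666c5900dd3bf90c9c114a9f382c3d6afd8fe6c465d10f"); err != nil {
		fmt.Println("remove failed:", err)
	}
}
```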
Dec 06 09:18:49 crc kubenswrapper[4954]: I1206 09:18:49.593853 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" path="/var/lib/kubelet/pods/0d2b73ae-2be2-4eab-b6ac-56b00a613beb/volumes"
Dec 06 09:19:02 crc kubenswrapper[4954]: I1206 09:19:02.443386 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:19:02 crc kubenswrapper[4954]: E1206 09:19:02.444183 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.942237 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-mxjln"]
Dec 06 09:19:08 crc kubenswrapper[4954]: E1206 09:19:08.942999 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943014 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 06 09:19:08 crc kubenswrapper[4954]: E1206 09:19:08.943044 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="extract-content"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943050 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="extract-content"
Dec 06 09:19:08 crc kubenswrapper[4954]: E1206 09:19:08.943064 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="extract-utilities"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943070 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="extract-utilities"
Dec 06 09:19:08 crc kubenswrapper[4954]: E1206 09:19:08.943080 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="registry-server"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943087 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="registry-server"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943288 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2b73ae-2be2-4eab-b6ac-56b00a613beb" containerName="registry-server"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.943314 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcd2e0b-c24c-4c7e-bad8-82b47335ace8" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.944159 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln"
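[Editor's note] The cpu_manager/memory_manager records above fire at pod admission: before admitting bootstrap-openstack-openstack-cell1-mxjln, the resource managers drop per-container state left behind by pods that no longer exist (the deleted tripleo-cleanup and redhat-marketplace pods). A minimal sketch of that pruning, with illustrative types standing in for the managers' state maps:

```go
package main

import "fmt"

// key identifies one container's resource assignment in the state map.
type key struct{ podUID, container string }

func main() {
	// Assignments left over from pods that have since been deleted.
	assignments := map[key]string{
		{"cbcd2e0b-c24c-4c7e-bad8-82b47335ace8", "tripleo-cleanup-tripleo-cleanup-openstack-cell1"}: "cpuset",
		{"0d2b73ae-2be2-4eab-b6ac-56b00a613beb", "registry-server"}:                                 "cpuset",
	}
	active := map[string]bool{} // neither pod is active any more

	// Prune every assignment whose pod is gone before admitting the new pod.
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k)
		}
	}
}
```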
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.946801 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.947369 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.947788 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.947861 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.951074 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6c94j"] Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.952884 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.960050 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.960449 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.964913 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-mxjln"] Dec 06 09:19:08 crc kubenswrapper[4954]: I1206 09:19:08.973603 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6c94j"] Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.014471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.014640 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdlc\" (UniqueName: \"kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.014736 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.014849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: 
\"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.014936 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchxs\" (UniqueName: \"kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.015194 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.015289 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.015451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117256 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117340 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchxs\" (UniqueName: \"kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117376 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117395 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 
09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117450 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117489 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117525 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdlc\" (UniqueName: \"kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.117578 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.124104 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.124148 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.127461 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.128306 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.128885 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key\") pod 
\"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.137003 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.141232 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdlc\" (UniqueName: \"kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc\") pod \"bootstrap-openstack-openstack-networker-6c94j\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.143099 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchxs\" (UniqueName: \"kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs\") pod \"bootstrap-openstack-openstack-cell1-mxjln\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.265486 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.281324 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:19:09 crc kubenswrapper[4954]: I1206 09:19:09.925926 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-mxjln"] Dec 06 09:19:10 crc kubenswrapper[4954]: W1206 09:19:10.001379 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6ef0d02_4ed6_4021_b60d_a06dc6566c48.slice/crio-697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604 WatchSource:0}: Error finding container 697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604: Status 404 returned error can't find the container with id 697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604 Dec 06 09:19:10 crc kubenswrapper[4954]: I1206 09:19:10.001792 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6c94j"] Dec 06 09:19:10 crc kubenswrapper[4954]: I1206 09:19:10.053526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" event={"ID":"b6ef0d02-4ed6-4021-b60d-a06dc6566c48","Type":"ContainerStarted","Data":"697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604"} Dec 06 09:19:10 crc kubenswrapper[4954]: I1206 09:19:10.054990 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" event={"ID":"c6c6edf1-d239-4369-a85a-822e425a2909","Type":"ContainerStarted","Data":"5e4773190e1d8e1ff61d45748357ec2a50ad27db3b0b97c8dbc4cfa35bf6282b"} Dec 06 09:19:11 crc kubenswrapper[4954]: I1206 09:19:11.067390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-networker-6c94j" event={"ID":"b6ef0d02-4ed6-4021-b60d-a06dc6566c48","Type":"ContainerStarted","Data":"984877c432345815aa85407ed92903fc48c4c80d75c5d939ec8e34f31d299fb9"} Dec 06 09:19:11 crc kubenswrapper[4954]: I1206 09:19:11.068955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" event={"ID":"c6c6edf1-d239-4369-a85a-822e425a2909","Type":"ContainerStarted","Data":"a4d9ae5fe73858607e5909c3e4ef4af6bef874c8ca86f41d479fc3811d25f3c6"} Dec 06 09:19:11 crc kubenswrapper[4954]: I1206 09:19:11.088970 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" podStartSLOduration=2.683866135 podStartE2EDuration="3.088951677s" podCreationTimestamp="2025-12-06 09:19:08 +0000 UTC" firstStartedPulling="2025-12-06 09:19:10.003840218 +0000 UTC m=+8524.817199617" lastFinishedPulling="2025-12-06 09:19:10.40892577 +0000 UTC m=+8525.222285159" observedRunningTime="2025-12-06 09:19:11.082984177 +0000 UTC m=+8525.896343566" watchObservedRunningTime="2025-12-06 09:19:11.088951677 +0000 UTC m=+8525.902311066" Dec 06 09:19:11 crc kubenswrapper[4954]: I1206 09:19:11.113479 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" podStartSLOduration=2.5732148009999998 podStartE2EDuration="3.113456104s" podCreationTimestamp="2025-12-06 09:19:08 +0000 UTC" firstStartedPulling="2025-12-06 09:19:09.932239379 +0000 UTC m=+8524.745598768" lastFinishedPulling="2025-12-06 09:19:10.472480682 +0000 UTC m=+8525.285840071" observedRunningTime="2025-12-06 09:19:11.103383314 +0000 UTC m=+8525.916742733" watchObservedRunningTime="2025-12-06 09:19:11.113456104 +0000 UTC m=+8525.926815513" Dec 06 09:19:15 crc kubenswrapper[4954]: I1206 09:19:15.450140 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:19:15 crc kubenswrapper[4954]: E1206 09:19:15.450714 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:19:23 crc kubenswrapper[4954]: I1206 09:19:23.201295 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-d4b995744-2qch6" podUID="4f6b671e-20ef-4d02-aedd-a5d46bc23b40" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 06 09:19:29 crc kubenswrapper[4954]: I1206 09:19:29.004311 4954 scope.go:117] "RemoveContainer" containerID="462cad44fd266e5407bf823a7ada4fd1db3ebdd3fcb5bbacc34ef526a65ccbde" Dec 06 09:19:29 crc kubenswrapper[4954]: I1206 09:19:29.027186 4954 scope.go:117] "RemoveContainer" containerID="455db10736ecb45602e4af3895f5ab9cf9a22300fc9441546721dcfeebe2e38d" Dec 06 09:19:29 crc kubenswrapper[4954]: I1206 09:19:29.092834 4954 scope.go:117] "RemoveContainer" containerID="abcdeb08fa7b85002381dca07d746f3d79ea7ffff01e277649f6a2c8ca45c1b9" Dec 06 09:19:30 crc kubenswrapper[4954]: I1206 09:19:30.443771 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:19:30 crc 
Dec 06 09:19:41 crc kubenswrapper[4954]: I1206 09:19:41.453267 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:19:41 crc kubenswrapper[4954]: E1206 09:19:41.458309 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:19:54 crc kubenswrapper[4954]: I1206 09:19:54.443437 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:19:54 crc kubenswrapper[4954]: E1206 09:19:54.444349 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:20:08 crc kubenswrapper[4954]: I1206 09:20:08.444252 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:20:08 crc kubenswrapper[4954]: E1206 09:20:08.445180 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:20:22 crc kubenswrapper[4954]: I1206 09:20:22.443658 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:20:22 crc kubenswrapper[4954]: E1206 09:20:22.444400 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.190882 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4ktm"]
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.194753 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.207846 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4ktm"]
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.294097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6qrp\" (UniqueName: \"kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.294161 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.294355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.385422 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"]
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.387927 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.396671 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6qrp\" (UniqueName: \"kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.396727 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.396783 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.397281 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.397382 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.404318 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"]
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.433435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6qrp\" (UniqueName: \"kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp\") pod \"community-operators-d4ktm\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.499324 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpfd\" (UniqueName: \"kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.499407 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.499457 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.525914 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.602682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpfd\" (UniqueName: \"kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.602768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.602820 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.605127 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.605790 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.627280 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpfd\" (UniqueName: \"kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd\") pod \"certified-operators-2bl2h\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:29 crc kubenswrapper[4954]: I1206 09:20:29.735624 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.202677 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4ktm"]
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.383136 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"]
Dec 06 09:20:30 crc kubenswrapper[4954]: W1206 09:20:30.466884 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9aaa92c_1222_425b_9cf6_c56c7dca0f25.slice/crio-282dbc35cc8f967c77860cca33ed87e8613d0ece0d3aec6e9811afcb39cb0584 WatchSource:0}: Error finding container 282dbc35cc8f967c77860cca33ed87e8613d0ece0d3aec6e9811afcb39cb0584: Status 404 returned error can't find the container with id 282dbc35cc8f967c77860cca33ed87e8613d0ece0d3aec6e9811afcb39cb0584
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.826111 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerID="b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c" exitCode=0
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.826189 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerDied","Data":"b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c"}
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.826409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerStarted","Data":"8d8bab951d9fe9bfb50e2ca5962e1ff9bd4bbbf7d4cc51de9576092472d12f87"}
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.828355 4954 generic.go:334] "Generic (PLEG): container finished" podID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerID="1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d" exitCode=0
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.828446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerDied","Data":"1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d"}
Dec 06 09:20:30 crc kubenswrapper[4954]: I1206 09:20:30.828491 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerStarted","Data":"282dbc35cc8f967c77860cca33ed87e8613d0ece0d3aec6e9811afcb39cb0584"}
Dec 06 09:20:31 crc kubenswrapper[4954]: I1206 09:20:31.856960 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerStarted","Data":"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587"}
Dec 06 09:20:32 crc kubenswrapper[4954]: I1206 09:20:32.890862 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerStarted","Data":"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74"}
Dec 06 09:20:33 crc kubenswrapper[4954]: I1206 09:20:33.911520 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerID="d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74" exitCode=0
Dec 06 09:20:33 crc kubenswrapper[4954]: I1206 09:20:33.911948 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerDied","Data":"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74"}
Dec 06 09:20:33 crc kubenswrapper[4954]: I1206 09:20:33.917257 4954 generic.go:334] "Generic (PLEG): container finished" podID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerID="59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587" exitCode=0
Dec 06 09:20:33 crc kubenswrapper[4954]: I1206 09:20:33.917295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerDied","Data":"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587"}
Dec 06 09:20:35 crc kubenswrapper[4954]: I1206 09:20:35.451711 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:20:35 crc kubenswrapper[4954]: E1206 09:20:35.452427 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:20:35 crc kubenswrapper[4954]: I1206 09:20:35.941492 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerStarted","Data":"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca"}
Dec 06 09:20:35 crc kubenswrapper[4954]: I1206 09:20:35.944888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerStarted","Data":"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517"}
Dec 06 09:20:35 crc kubenswrapper[4954]: I1206 09:20:35.965403 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4ktm" podStartSLOduration=2.87608013 podStartE2EDuration="6.965375958s" podCreationTimestamp="2025-12-06 09:20:29 +0000 UTC" firstStartedPulling="2025-12-06 09:20:30.827942667 +0000 UTC m=+8605.641302056" lastFinishedPulling="2025-12-06 09:20:34.917238495 +0000 UTC m=+8609.730597884" observedRunningTime="2025-12-06 09:20:35.959307276 +0000 UTC m=+8610.772666665" watchObservedRunningTime="2025-12-06 09:20:35.965375958 +0000 UTC m=+8610.778735347"
Dec 06 09:20:35 crc kubenswrapper[4954]: I1206 09:20:35.984022 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2bl2h" podStartSLOduration=3.049238746 podStartE2EDuration="6.983999024s" podCreationTimestamp="2025-12-06 09:20:29 +0000 UTC" firstStartedPulling="2025-12-06 09:20:30.831402689 +0000 UTC m=+8605.644762068" lastFinishedPulling="2025-12-06 09:20:34.766162957 +0000 UTC m=+8609.579522346" observedRunningTime="2025-12-06 09:20:35.980577573 +0000 UTC m=+8610.793936962" watchObservedRunningTime="2025-12-06 09:20:35.983999024 +0000 UTC m=+8610.797358433"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.527128 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.527806 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.572608 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.736813 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.736861 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:39 crc kubenswrapper[4954]: I1206 09:20:39.780816 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:47 crc kubenswrapper[4954]: I1206 09:20:47.443978 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173"
Dec 06 09:20:47 crc kubenswrapper[4954]: E1206 09:20:47.444757 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:20:49 crc kubenswrapper[4954]: I1206 09:20:49.588273 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4ktm"
Dec 06 09:20:50 crc kubenswrapper[4954]: I1206 09:20:49.648451 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4ktm"]
Dec 06 09:20:50 crc kubenswrapper[4954]: I1206 09:20:49.794094 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2bl2h"
Dec 06 09:20:50 crc kubenswrapper[4954]: I1206 09:20:50.080847 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d4ktm" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="registry-server" containerID="cri-o://30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca" gracePeriod=2
Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.075094 4954 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-d4ktm" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.099984 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerID="30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca" exitCode=0 Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.100042 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerDied","Data":"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca"} Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.100071 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ktm" event={"ID":"cfbcd29f-472f-4eee-bfff-d34cc65ff037","Type":"ContainerDied","Data":"8d8bab951d9fe9bfb50e2ca5962e1ff9bd4bbbf7d4cc51de9576092472d12f87"} Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.100106 4954 scope.go:117] "RemoveContainer" containerID="30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.100283 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4ktm" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.138315 4954 scope.go:117] "RemoveContainer" containerID="d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.163665 4954 scope.go:117] "RemoveContainer" containerID="b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.191465 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6qrp\" (UniqueName: \"kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp\") pod \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.191545 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities\") pod \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.191772 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content\") pod \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\" (UID: \"cfbcd29f-472f-4eee-bfff-d34cc65ff037\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.192427 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities" (OuterVolumeSpecName: "utilities") pod "cfbcd29f-472f-4eee-bfff-d34cc65ff037" (UID: "cfbcd29f-472f-4eee-bfff-d34cc65ff037"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.199726 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp" (OuterVolumeSpecName: "kube-api-access-r6qrp") pod "cfbcd29f-472f-4eee-bfff-d34cc65ff037" (UID: "cfbcd29f-472f-4eee-bfff-d34cc65ff037"). InnerVolumeSpecName "kube-api-access-r6qrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.221775 4954 scope.go:117] "RemoveContainer" containerID="30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca" Dec 06 09:20:51 crc kubenswrapper[4954]: E1206 09:20:51.222306 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca\": container with ID starting with 30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca not found: ID does not exist" containerID="30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.222344 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca"} err="failed to get container status \"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca\": rpc error: code = NotFound desc = could not find container \"30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca\": container with ID starting with 30a3697a74a88be927cf66cdd7f10abb4edfe91807c1b918d68b8345847662ca not found: ID does not exist" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.222366 4954 scope.go:117] "RemoveContainer" containerID="d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74" Dec 06 09:20:51 crc kubenswrapper[4954]: E1206 09:20:51.222603 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74\": container with ID starting with d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74 not found: ID does not exist" containerID="d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.222630 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74"} err="failed to get container status \"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74\": rpc error: code = NotFound desc = could not find container \"d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74\": container with ID starting with d827ff63d8794ea0d2a123409e7548b12cf037d434891df250a4a696db3dcd74 not found: ID does not exist" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.222646 4954 scope.go:117] "RemoveContainer" containerID="b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c" Dec 06 09:20:51 crc kubenswrapper[4954]: E1206 09:20:51.222813 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c\": container with ID starting with b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c not found: ID does not 
exist" containerID="b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.222838 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c"} err="failed to get container status \"b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c\": rpc error: code = NotFound desc = could not find container \"b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c\": container with ID starting with b3f5f8ec7f82e8aaa15b226eff9d9ee6a3e89434375ad5cf8d120ad038c0914c not found: ID does not exist" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.227678 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"] Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.228143 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2bl2h" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="registry-server" containerID="cri-o://4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517" gracePeriod=2 Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.261920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfbcd29f-472f-4eee-bfff-d34cc65ff037" (UID: "cfbcd29f-472f-4eee-bfff-d34cc65ff037"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.293800 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.294039 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6qrp\" (UniqueName: \"kubernetes.io/projected/cfbcd29f-472f-4eee-bfff-d34cc65ff037-kube-api-access-r6qrp\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.294133 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfbcd29f-472f-4eee-bfff-d34cc65ff037-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.435823 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4ktm"] Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.455125 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d4ktm"] Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.717469 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bl2h" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.802840 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lpfd\" (UniqueName: \"kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd\") pod \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.803135 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content\") pod \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.803208 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities\") pod \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\" (UID: \"e9aaa92c-1222-425b-9cf6-c56c7dca0f25\") " Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.803932 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities" (OuterVolumeSpecName: "utilities") pod "e9aaa92c-1222-425b-9cf6-c56c7dca0f25" (UID: "e9aaa92c-1222-425b-9cf6-c56c7dca0f25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.806970 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd" (OuterVolumeSpecName: "kube-api-access-6lpfd") pod "e9aaa92c-1222-425b-9cf6-c56c7dca0f25" (UID: "e9aaa92c-1222-425b-9cf6-c56c7dca0f25"). InnerVolumeSpecName "kube-api-access-6lpfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.849689 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9aaa92c-1222-425b-9cf6-c56c7dca0f25" (UID: "e9aaa92c-1222-425b-9cf6-c56c7dca0f25"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.905479 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lpfd\" (UniqueName: \"kubernetes.io/projected/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-kube-api-access-6lpfd\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.905520 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:51 crc kubenswrapper[4954]: I1206 09:20:51.905532 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9aaa92c-1222-425b-9cf6-c56c7dca0f25-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.111883 4954 generic.go:334] "Generic (PLEG): container finished" podID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerID="4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517" exitCode=0 Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.111928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerDied","Data":"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517"} Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.111938 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bl2h" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.111959 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bl2h" event={"ID":"e9aaa92c-1222-425b-9cf6-c56c7dca0f25","Type":"ContainerDied","Data":"282dbc35cc8f967c77860cca33ed87e8613d0ece0d3aec6e9811afcb39cb0584"} Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.111976 4954 scope.go:117] "RemoveContainer" containerID="4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.136124 4954 scope.go:117] "RemoveContainer" containerID="59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.156378 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"] Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.164991 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2bl2h"] Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.178184 4954 scope.go:117] "RemoveContainer" containerID="1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.195165 4954 scope.go:117] "RemoveContainer" containerID="4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517" Dec 06 09:20:52 crc kubenswrapper[4954]: E1206 09:20:52.195627 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517\": container with ID starting with 4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517 not found: ID does not exist" containerID="4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.195657 
4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517"} err="failed to get container status \"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517\": rpc error: code = NotFound desc = could not find container \"4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517\": container with ID starting with 4f8f5d3e1c3a22b50ac22ac46da66b8ae568b23a280c059f5211bb4825f26517 not found: ID does not exist" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.195681 4954 scope.go:117] "RemoveContainer" containerID="59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587" Dec 06 09:20:52 crc kubenswrapper[4954]: E1206 09:20:52.195943 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587\": container with ID starting with 59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587 not found: ID does not exist" containerID="59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.195983 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587"} err="failed to get container status \"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587\": rpc error: code = NotFound desc = could not find container \"59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587\": container with ID starting with 59afbf14d6fd8aec2e5f606b5b160da6d14c0f076a136653599b2a487cf2f587 not found: ID does not exist" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.196010 4954 scope.go:117] "RemoveContainer" containerID="1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d" Dec 06 09:20:52 crc kubenswrapper[4954]: E1206 09:20:52.196328 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d\": container with ID starting with 1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d not found: ID does not exist" containerID="1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d" Dec 06 09:20:52 crc kubenswrapper[4954]: I1206 09:20:52.196351 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d"} err="failed to get container status \"1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d\": rpc error: code = NotFound desc = could not find container \"1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d\": container with ID starting with 1b3b5bc8298b24e9b0276ff1c0a0705af5433549ee363f0e472edc81320f4e6d not found: ID does not exist" Dec 06 09:20:53 crc kubenswrapper[4954]: I1206 09:20:53.465442 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" path="/var/lib/kubelet/pods/cfbcd29f-472f-4eee-bfff-d34cc65ff037/volumes" Dec 06 09:20:53 crc kubenswrapper[4954]: I1206 09:20:53.466618 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" path="/var/lib/kubelet/pods/e9aaa92c-1222-425b-9cf6-c56c7dca0f25/volumes" Dec 06 09:20:58 crc kubenswrapper[4954]: I1206 
09:20:58.443285 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:20:58 crc kubenswrapper[4954]: E1206 09:20:58.444059 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:21:13 crc kubenswrapper[4954]: I1206 09:21:13.443645 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:21:13 crc kubenswrapper[4954]: E1206 09:21:13.444613 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:21:26 crc kubenswrapper[4954]: I1206 09:21:26.443992 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:21:26 crc kubenswrapper[4954]: E1206 09:21:26.445656 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:21:39 crc kubenswrapper[4954]: I1206 09:21:39.444110 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:21:39 crc kubenswrapper[4954]: E1206 09:21:39.445008 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:21:52 crc kubenswrapper[4954]: I1206 09:21:52.443919 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:21:52 crc kubenswrapper[4954]: I1206 09:21:52.741379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3"} Dec 06 09:22:11 crc kubenswrapper[4954]: I1206 09:22:11.884995 4954 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.225924861s: [/var/lib/containers/storage/overlay/caca706aca3462a9fd612daccce2b1fd783fc16cbed21c8239f5cbe2c90faecd/diff 
/var/log/pods/openstack_bootstrap-openstack-openstack-networker-6c94j_b6ef0d02-4ed6-4021-b60d-a06dc6566c48/bootstrap-openstack-openstack-networker/0.log]; will not log again for this container unless duration exceeds 2s Dec 06 09:22:12 crc kubenswrapper[4954]: I1206 09:22:12.940758 4954 generic.go:334] "Generic (PLEG): container finished" podID="c6c6edf1-d239-4369-a85a-822e425a2909" containerID="a4d9ae5fe73858607e5909c3e4ef4af6bef874c8ca86f41d479fc3811d25f3c6" exitCode=0 Dec 06 09:22:12 crc kubenswrapper[4954]: I1206 09:22:12.940851 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" event={"ID":"c6c6edf1-d239-4369-a85a-822e425a2909","Type":"ContainerDied","Data":"a4d9ae5fe73858607e5909c3e4ef4af6bef874c8ca86f41d479fc3811d25f3c6"} Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.493451 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.662084 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key\") pod \"c6c6edf1-d239-4369-a85a-822e425a2909\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.662148 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle\") pod \"c6c6edf1-d239-4369-a85a-822e425a2909\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.662407 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory\") pod \"c6c6edf1-d239-4369-a85a-822e425a2909\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.662487 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchxs\" (UniqueName: \"kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs\") pod \"c6c6edf1-d239-4369-a85a-822e425a2909\" (UID: \"c6c6edf1-d239-4369-a85a-822e425a2909\") " Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.668052 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c6c6edf1-d239-4369-a85a-822e425a2909" (UID: "c6c6edf1-d239-4369-a85a-822e425a2909"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.668784 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs" (OuterVolumeSpecName: "kube-api-access-jchxs") pod "c6c6edf1-d239-4369-a85a-822e425a2909" (UID: "c6c6edf1-d239-4369-a85a-822e425a2909"). InnerVolumeSpecName "kube-api-access-jchxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.691535 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory" (OuterVolumeSpecName: "inventory") pod "c6c6edf1-d239-4369-a85a-822e425a2909" (UID: "c6c6edf1-d239-4369-a85a-822e425a2909"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.692435 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6c6edf1-d239-4369-a85a-822e425a2909" (UID: "c6c6edf1-d239-4369-a85a-822e425a2909"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.766927 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.767527 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.767546 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6c6edf1-d239-4369-a85a-822e425a2909-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.767576 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchxs\" (UniqueName: \"kubernetes.io/projected/c6c6edf1-d239-4369-a85a-822e425a2909-kube-api-access-jchxs\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.974820 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" event={"ID":"c6c6edf1-d239-4369-a85a-822e425a2909","Type":"ContainerDied","Data":"5e4773190e1d8e1ff61d45748357ec2a50ad27db3b0b97c8dbc4cfa35bf6282b"} Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.974863 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4773190e1d8e1ff61d45748357ec2a50ad27db3b0b97c8dbc4cfa35bf6282b" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.974867 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-mxjln" Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.976537 4954 generic.go:334] "Generic (PLEG): container finished" podID="b6ef0d02-4ed6-4021-b60d-a06dc6566c48" containerID="984877c432345815aa85407ed92903fc48c4c80d75c5d939ec8e34f31d299fb9" exitCode=0 Dec 06 09:22:14 crc kubenswrapper[4954]: I1206 09:22:14.976607 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" event={"ID":"b6ef0d02-4ed6-4021-b60d-a06dc6566c48","Type":"ContainerDied","Data":"984877c432345815aa85407ed92903fc48c4c80d75c5d939ec8e34f31d299fb9"} Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.073654 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-bjn9q"] Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074111 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074130 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074143 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="extract-content" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074150 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="extract-content" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074170 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="extract-content" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074177 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="extract-content" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074188 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c6edf1-d239-4369-a85a-822e425a2909" containerName="bootstrap-openstack-openstack-cell1" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074194 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c6edf1-d239-4369-a85a-822e425a2909" containerName="bootstrap-openstack-openstack-cell1" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074208 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="extract-utilities" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074214 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="extract-utilities" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074221 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074227 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: E1206 09:22:15.074241 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="extract-utilities" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074247 4954 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="extract-utilities" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074454 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c6edf1-d239-4369-a85a-822e425a2909" containerName="bootstrap-openstack-openstack-cell1" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074467 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfbcd29f-472f-4eee-bfff-d34cc65ff037" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.074511 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9aaa92c-1222-425b-9cf6-c56c7dca0f25" containerName="registry-server" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.075375 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.079179 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.079206 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.082613 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-bjn9q"] Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.176933 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vpp9\" (UniqueName: \"kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.177043 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.177090 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.279216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.279945 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: 
\"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.280101 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vpp9\" (UniqueName: \"kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.287235 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.293398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.306936 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vpp9\" (UniqueName: \"kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9\") pod \"download-cache-openstack-openstack-cell1-bjn9q\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:15 crc kubenswrapper[4954]: I1206 09:22:15.405294 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.023378 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-bjn9q"] Dec 06 09:22:16 crc kubenswrapper[4954]: W1206 09:22:16.024708 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fda341b_7000_4aab_ba41_ba427a3c33bd.slice/crio-32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15 WatchSource:0}: Error finding container 32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15: Status 404 returned error can't find the container with id 32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15 Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.029629 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.414221 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.500635 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory\") pod \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.500735 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key\") pod \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.500883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle\") pod \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.500941 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdlc\" (UniqueName: \"kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc\") pod \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\" (UID: \"b6ef0d02-4ed6-4021-b60d-a06dc6566c48\") " Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.506106 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc" (OuterVolumeSpecName: "kube-api-access-ftdlc") pod "b6ef0d02-4ed6-4021-b60d-a06dc6566c48" (UID: "b6ef0d02-4ed6-4021-b60d-a06dc6566c48"). InnerVolumeSpecName "kube-api-access-ftdlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.506674 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b6ef0d02-4ed6-4021-b60d-a06dc6566c48" (UID: "b6ef0d02-4ed6-4021-b60d-a06dc6566c48"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.532047 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory" (OuterVolumeSpecName: "inventory") pod "b6ef0d02-4ed6-4021-b60d-a06dc6566c48" (UID: "b6ef0d02-4ed6-4021-b60d-a06dc6566c48"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.532990 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b6ef0d02-4ed6-4021-b60d-a06dc6566c48" (UID: "b6ef0d02-4ed6-4021-b60d-a06dc6566c48"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.602830 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdlc\" (UniqueName: \"kubernetes.io/projected/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-kube-api-access-ftdlc\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.603078 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.603091 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.603104 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef0d02-4ed6-4021-b60d-a06dc6566c48-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.994633 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" event={"ID":"7fda341b-7000-4aab-ba41-ba427a3c33bd","Type":"ContainerStarted","Data":"0a6b5df4d5b99c81ca9c8a073f27a02f6539f92fa92f8fcf935fb60d968c273a"} Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.994700 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" event={"ID":"7fda341b-7000-4aab-ba41-ba427a3c33bd","Type":"ContainerStarted","Data":"32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15"} Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.996757 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" event={"ID":"b6ef0d02-4ed6-4021-b60d-a06dc6566c48","Type":"ContainerDied","Data":"697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604"} Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.996789 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6c94j" Dec 06 09:22:16 crc kubenswrapper[4954]: I1206 09:22:16.996799 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="697b861f01c7cfe8998bafef5d6a3f701277e618e852429238f1a1f8f8f10604" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.016209 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" podStartSLOduration=1.516025778 podStartE2EDuration="2.016192222s" podCreationTimestamp="2025-12-06 09:22:15 +0000 UTC" firstStartedPulling="2025-12-06 09:22:16.029379334 +0000 UTC m=+8710.842738723" lastFinishedPulling="2025-12-06 09:22:16.529545778 +0000 UTC m=+8711.342905167" observedRunningTime="2025-12-06 09:22:17.011217129 +0000 UTC m=+8711.824576518" watchObservedRunningTime="2025-12-06 09:22:17.016192222 +0000 UTC m=+8711.829551611" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.085759 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-8wnbc"] Dec 06 09:22:17 crc kubenswrapper[4954]: E1206 09:22:17.086161 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ef0d02-4ed6-4021-b60d-a06dc6566c48" containerName="bootstrap-openstack-openstack-networker" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.086175 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ef0d02-4ed6-4021-b60d-a06dc6566c48" containerName="bootstrap-openstack-openstack-networker" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.086410 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ef0d02-4ed6-4021-b60d-a06dc6566c48" containerName="bootstrap-openstack-openstack-networker" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.087206 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.089976 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.090227 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.117548 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-8wnbc"] Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.214844 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c44x\" (UniqueName: \"kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.214906 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.215070 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.319249 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.320245 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c44x\" (UniqueName: \"kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.320360 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.324724 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory\") pod 
\"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.327409 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.337033 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c44x\" (UniqueName: \"kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x\") pod \"download-cache-openstack-openstack-networker-8wnbc\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.403384 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:22:17 crc kubenswrapper[4954]: I1206 09:22:17.941860 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-8wnbc"] Dec 06 09:22:17 crc kubenswrapper[4954]: W1206 09:22:17.946143 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d75c3e1_cfed_4a30_867d_e9aeabd7cee7.slice/crio-c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342 WatchSource:0}: Error finding container c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342: Status 404 returned error can't find the container with id c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342 Dec 06 09:22:18 crc kubenswrapper[4954]: I1206 09:22:18.016999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" event={"ID":"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7","Type":"ContainerStarted","Data":"c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342"} Dec 06 09:22:21 crc kubenswrapper[4954]: I1206 09:22:21.042256 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" event={"ID":"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7","Type":"ContainerStarted","Data":"05a3beb433ad9fdc20bd7ee0a0e52f6231631f8e579ee84c66c861299b7f4e92"} Dec 06 09:22:21 crc kubenswrapper[4954]: I1206 09:22:21.063204 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" podStartSLOduration=2.472721181 podStartE2EDuration="4.063182811s" podCreationTimestamp="2025-12-06 09:22:17 +0000 UTC" firstStartedPulling="2025-12-06 09:22:17.949606536 +0000 UTC m=+8712.762965925" lastFinishedPulling="2025-12-06 09:22:19.540068146 +0000 UTC m=+8714.353427555" observedRunningTime="2025-12-06 09:22:21.05450047 +0000 UTC m=+8715.867859849" watchObservedRunningTime="2025-12-06 09:22:21.063182811 +0000 UTC m=+8715.876542200" Dec 06 09:23:29 crc kubenswrapper[4954]: I1206 09:23:29.652395 4954 generic.go:334] "Generic (PLEG): container finished" podID="8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" containerID="05a3beb433ad9fdc20bd7ee0a0e52f6231631f8e579ee84c66c861299b7f4e92" exitCode=0 Dec 06 09:23:29 crc 
kubenswrapper[4954]: I1206 09:23:29.652505 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" event={"ID":"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7","Type":"ContainerDied","Data":"05a3beb433ad9fdc20bd7ee0a0e52f6231631f8e579ee84c66c861299b7f4e92"} Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.042921 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.197682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") pod \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.197720 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory\") pod \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.197761 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c44x\" (UniqueName: \"kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x\") pod \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.205290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x" (OuterVolumeSpecName: "kube-api-access-8c44x") pod "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" (UID: "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7"). InnerVolumeSpecName "kube-api-access-8c44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:31 crc kubenswrapper[4954]: E1206 09:23:31.227200 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key podName:8d75c3e1-cfed-4a30-867d-e9aeabd7cee7 nodeName:}" failed. No retries permitted until 2025-12-06 09:23:31.727140653 +0000 UTC m=+8786.540500042 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key") pod "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" (UID: "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7") : error deleting /var/lib/kubelet/pods/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7/volume-subpaths: remove /var/lib/kubelet/pods/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7/volume-subpaths: no such file or directory Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.229387 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory" (OuterVolumeSpecName: "inventory") pod "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" (UID: "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.301444 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.301478 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c44x\" (UniqueName: \"kubernetes.io/projected/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-kube-api-access-8c44x\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.671672 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" event={"ID":"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7","Type":"ContainerDied","Data":"c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342"} Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.671709 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-8wnbc" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.671713 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c42d06aff0133f9970037b5f4469422b6c65b29c319434b8d2796d16c1d342" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.770316 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-dkvqf"] Dec 06 09:23:31 crc kubenswrapper[4954]: E1206 09:23:31.771119 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" containerName="download-cache-openstack-openstack-networker" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.771241 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" containerName="download-cache-openstack-openstack-networker" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.771639 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" containerName="download-cache-openstack-openstack-networker" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.772529 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.779700 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-dkvqf"] Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.812002 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") pod \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\" (UID: \"8d75c3e1-cfed-4a30-867d-e9aeabd7cee7\") " Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.825863 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7" (UID: "8d75c3e1-cfed-4a30-867d-e9aeabd7cee7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.914083 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz924\" (UniqueName: \"kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.914216 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.914295 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:31 crc kubenswrapper[4954]: I1206 09:23:31.914466 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d75c3e1-cfed-4a30-867d-e9aeabd7cee7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.015773 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz924\" (UniqueName: \"kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.015843 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.015876 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.020723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.024255 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.038265 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz924\" (UniqueName: \"kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924\") pod \"configure-network-openstack-openstack-networker-dkvqf\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.099644 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.625274 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-dkvqf"] Dec 06 09:23:32 crc kubenswrapper[4954]: I1206 09:23:32.685127 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" event={"ID":"68cf26b2-12c2-40a9-9dcc-42048e97c0f4","Type":"ContainerStarted","Data":"18ea3ec07bc6769904a1abbd558d2aa6dad3f6cb8152227b0fd4ed48c3c57462"} Dec 06 09:23:33 crc kubenswrapper[4954]: I1206 09:23:33.694938 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" event={"ID":"68cf26b2-12c2-40a9-9dcc-42048e97c0f4","Type":"ContainerStarted","Data":"d82d8a05621bc4651253e69bdce6db086538243d1d676059fd8ecd2d28620d87"} Dec 06 09:23:33 crc kubenswrapper[4954]: I1206 09:23:33.716154 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" podStartSLOduration=2.239783188 podStartE2EDuration="2.716114227s" podCreationTimestamp="2025-12-06 09:23:31 +0000 UTC" firstStartedPulling="2025-12-06 09:23:32.624326681 +0000 UTC m=+8787.437686070" lastFinishedPulling="2025-12-06 09:23:33.10065772 +0000 UTC m=+8787.914017109" observedRunningTime="2025-12-06 09:23:33.708784792 +0000 UTC m=+8788.522144191" watchObservedRunningTime="2025-12-06 09:23:33.716114227 +0000 UTC m=+8788.529473616" Dec 06 09:23:50 crc kubenswrapper[4954]: I1206 09:23:50.857227 4954 generic.go:334] "Generic (PLEG): container finished" podID="7fda341b-7000-4aab-ba41-ba427a3c33bd" containerID="0a6b5df4d5b99c81ca9c8a073f27a02f6539f92fa92f8fcf935fb60d968c273a" exitCode=0 Dec 06 09:23:50 crc kubenswrapper[4954]: I1206 09:23:50.857762 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" event={"ID":"7fda341b-7000-4aab-ba41-ba427a3c33bd","Type":"ContainerDied","Data":"0a6b5df4d5b99c81ca9c8a073f27a02f6539f92fa92f8fcf935fb60d968c273a"} Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.576037 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.679491 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory\") pod \"7fda341b-7000-4aab-ba41-ba427a3c33bd\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.680049 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vpp9\" (UniqueName: \"kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9\") pod \"7fda341b-7000-4aab-ba41-ba427a3c33bd\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.680097 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key\") pod \"7fda341b-7000-4aab-ba41-ba427a3c33bd\" (UID: \"7fda341b-7000-4aab-ba41-ba427a3c33bd\") " Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.684926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9" (OuterVolumeSpecName: "kube-api-access-9vpp9") pod "7fda341b-7000-4aab-ba41-ba427a3c33bd" (UID: "7fda341b-7000-4aab-ba41-ba427a3c33bd"). InnerVolumeSpecName "kube-api-access-9vpp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.714818 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7fda341b-7000-4aab-ba41-ba427a3c33bd" (UID: "7fda341b-7000-4aab-ba41-ba427a3c33bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.715493 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory" (OuterVolumeSpecName: "inventory") pod "7fda341b-7000-4aab-ba41-ba427a3c33bd" (UID: "7fda341b-7000-4aab-ba41-ba427a3c33bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.782515 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vpp9\" (UniqueName: \"kubernetes.io/projected/7fda341b-7000-4aab-ba41-ba427a3c33bd-kube-api-access-9vpp9\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.782843 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.782854 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fda341b-7000-4aab-ba41-ba427a3c33bd-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.884185 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" event={"ID":"7fda341b-7000-4aab-ba41-ba427a3c33bd","Type":"ContainerDied","Data":"32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15"} Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.884232 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ace15edf35c3b6352507672ff19f00f6829abe0628cf628e27cedc85558a15" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.884303 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-bjn9q" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.961453 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lxt7m"] Dec 06 09:23:52 crc kubenswrapper[4954]: E1206 09:23:52.961880 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fda341b-7000-4aab-ba41-ba427a3c33bd" containerName="download-cache-openstack-openstack-cell1" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.961898 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fda341b-7000-4aab-ba41-ba427a3c33bd" containerName="download-cache-openstack-openstack-cell1" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.962140 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fda341b-7000-4aab-ba41-ba427a3c33bd" containerName="download-cache-openstack-openstack-cell1" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.962838 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.965845 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.966066 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:23:52 crc kubenswrapper[4954]: I1206 09:23:52.983365 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lxt7m"] Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.088401 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.088448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.088498 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qpr\" (UniqueName: \"kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.190909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.190956 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.190996 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qpr\" (UniqueName: \"kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.197253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: 
\"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.198998 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.209300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qpr\" (UniqueName: \"kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr\") pod \"configure-network-openstack-openstack-cell1-lxt7m\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.289043 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:23:53 crc kubenswrapper[4954]: W1206 09:23:53.855054 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod723e1f46_4585_4b35_ab0f_cdd381997052.slice/crio-01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3 WatchSource:0}: Error finding container 01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3: Status 404 returned error can't find the container with id 01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3 Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.877753 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lxt7m"] Dec 06 09:23:53 crc kubenswrapper[4954]: I1206 09:23:53.896304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" event={"ID":"723e1f46-4585-4b35-ab0f-cdd381997052","Type":"ContainerStarted","Data":"01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3"} Dec 06 09:23:55 crc kubenswrapper[4954]: I1206 09:23:55.915011 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" event={"ID":"723e1f46-4585-4b35-ab0f-cdd381997052","Type":"ContainerStarted","Data":"d21cdf70fd10ab7dac8e5082eb4b51f750d2e7cd0dbbf730a848fb337440615c"} Dec 06 09:23:55 crc kubenswrapper[4954]: I1206 09:23:55.935753 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" podStartSLOduration=2.972421234 podStartE2EDuration="3.935732835s" podCreationTimestamp="2025-12-06 09:23:52 +0000 UTC" firstStartedPulling="2025-12-06 09:23:53.862050782 +0000 UTC m=+8808.675410171" lastFinishedPulling="2025-12-06 09:23:54.825362393 +0000 UTC m=+8809.638721772" observedRunningTime="2025-12-06 09:23:55.935536229 +0000 UTC m=+8810.748895628" watchObservedRunningTime="2025-12-06 09:23:55.935732835 +0000 UTC m=+8810.749092244" Dec 06 09:24:10 crc kubenswrapper[4954]: I1206 09:24:10.101363 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 06 09:24:10 crc kubenswrapper[4954]: I1206 09:24:10.101923 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:24:34 crc kubenswrapper[4954]: I1206 09:24:34.245204 4954 generic.go:334] "Generic (PLEG): container finished" podID="68cf26b2-12c2-40a9-9dcc-42048e97c0f4" containerID="d82d8a05621bc4651253e69bdce6db086538243d1d676059fd8ecd2d28620d87" exitCode=0 Dec 06 09:24:34 crc kubenswrapper[4954]: I1206 09:24:34.245286 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" event={"ID":"68cf26b2-12c2-40a9-9dcc-42048e97c0f4","Type":"ContainerDied","Data":"d82d8a05621bc4651253e69bdce6db086538243d1d676059fd8ecd2d28620d87"} Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.723089 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.777379 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz924\" (UniqueName: \"kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924\") pod \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.778541 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key\") pod \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.778704 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory\") pod \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\" (UID: \"68cf26b2-12c2-40a9-9dcc-42048e97c0f4\") " Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.783513 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924" (OuterVolumeSpecName: "kube-api-access-pz924") pod "68cf26b2-12c2-40a9-9dcc-42048e97c0f4" (UID: "68cf26b2-12c2-40a9-9dcc-42048e97c0f4"). InnerVolumeSpecName "kube-api-access-pz924". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.813342 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "68cf26b2-12c2-40a9-9dcc-42048e97c0f4" (UID: "68cf26b2-12c2-40a9-9dcc-42048e97c0f4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.831921 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory" (OuterVolumeSpecName: "inventory") pod "68cf26b2-12c2-40a9-9dcc-42048e97c0f4" (UID: "68cf26b2-12c2-40a9-9dcc-42048e97c0f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.882009 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.882047 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz924\" (UniqueName: \"kubernetes.io/projected/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-kube-api-access-pz924\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:35 crc kubenswrapper[4954]: I1206 09:24:35.882063 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68cf26b2-12c2-40a9-9dcc-42048e97c0f4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.263931 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" event={"ID":"68cf26b2-12c2-40a9-9dcc-42048e97c0f4","Type":"ContainerDied","Data":"18ea3ec07bc6769904a1abbd558d2aa6dad3f6cb8152227b0fd4ed48c3c57462"} Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.264177 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ea3ec07bc6769904a1abbd558d2aa6dad3f6cb8152227b0fd4ed48c3c57462" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.264019 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-dkvqf" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.359579 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-vk44j"] Dec 06 09:24:36 crc kubenswrapper[4954]: E1206 09:24:36.360114 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cf26b2-12c2-40a9-9dcc-42048e97c0f4" containerName="configure-network-openstack-openstack-networker" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.360136 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cf26b2-12c2-40a9-9dcc-42048e97c0f4" containerName="configure-network-openstack-openstack-networker" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.360323 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cf26b2-12c2-40a9-9dcc-42048e97c0f4" containerName="configure-network-openstack-openstack-networker" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.361014 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.363528 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.363888 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.373242 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-vk44j"] Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.492471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl7zq\" (UniqueName: \"kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.492543 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.493169 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.596005 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl7zq\" (UniqueName: \"kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.596070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.596178 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.607321 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key\") pod 
\"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.607405 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.614607 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl7zq\" (UniqueName: \"kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq\") pod \"validate-network-openstack-openstack-networker-vk44j\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:36 crc kubenswrapper[4954]: I1206 09:24:36.684714 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:37 crc kubenswrapper[4954]: I1206 09:24:37.267852 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-vk44j"] Dec 06 09:24:38 crc kubenswrapper[4954]: I1206 09:24:38.313093 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-vk44j" event={"ID":"b3a3b17a-af50-4e16-8ee2-011bfe71370b","Type":"ContainerStarted","Data":"711fe4d85f8a181a761d3d9ebb8c6fe9d125b58d2a1ecf2a774e32ef0b1c9fd9"} Dec 06 09:24:39 crc kubenswrapper[4954]: I1206 09:24:39.322971 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-vk44j" event={"ID":"b3a3b17a-af50-4e16-8ee2-011bfe71370b","Type":"ContainerStarted","Data":"ea9e18384c0b5114ca7b128df00d11618168e1e5aa8a20c562bd4ef930548a15"} Dec 06 09:24:39 crc kubenswrapper[4954]: I1206 09:24:39.341798 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-networker-vk44j" podStartSLOduration=2.119979993 podStartE2EDuration="3.341777034s" podCreationTimestamp="2025-12-06 09:24:36 +0000 UTC" firstStartedPulling="2025-12-06 09:24:37.27285803 +0000 UTC m=+8852.086217409" lastFinishedPulling="2025-12-06 09:24:38.494655061 +0000 UTC m=+8853.308014450" observedRunningTime="2025-12-06 09:24:39.339240017 +0000 UTC m=+8854.152599406" watchObservedRunningTime="2025-12-06 09:24:39.341777034 +0000 UTC m=+8854.155136423" Dec 06 09:24:40 crc kubenswrapper[4954]: I1206 09:24:40.101178 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:24:40 crc kubenswrapper[4954]: I1206 09:24:40.101243 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:24:45 crc kubenswrapper[4954]: I1206 
09:24:45.373669 4954 generic.go:334] "Generic (PLEG): container finished" podID="b3a3b17a-af50-4e16-8ee2-011bfe71370b" containerID="ea9e18384c0b5114ca7b128df00d11618168e1e5aa8a20c562bd4ef930548a15" exitCode=0 Dec 06 09:24:45 crc kubenswrapper[4954]: I1206 09:24:45.373751 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-vk44j" event={"ID":"b3a3b17a-af50-4e16-8ee2-011bfe71370b","Type":"ContainerDied","Data":"ea9e18384c0b5114ca7b128df00d11618168e1e5aa8a20c562bd4ef930548a15"} Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.830582 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.927532 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key\") pod \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.928016 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl7zq\" (UniqueName: \"kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq\") pod \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.928055 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory\") pod \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\" (UID: \"b3a3b17a-af50-4e16-8ee2-011bfe71370b\") " Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.939166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq" (OuterVolumeSpecName: "kube-api-access-wl7zq") pod "b3a3b17a-af50-4e16-8ee2-011bfe71370b" (UID: "b3a3b17a-af50-4e16-8ee2-011bfe71370b"). InnerVolumeSpecName "kube-api-access-wl7zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.979729 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory" (OuterVolumeSpecName: "inventory") pod "b3a3b17a-af50-4e16-8ee2-011bfe71370b" (UID: "b3a3b17a-af50-4e16-8ee2-011bfe71370b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:46 crc kubenswrapper[4954]: I1206 09:24:46.982204 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b3a3b17a-af50-4e16-8ee2-011bfe71370b" (UID: "b3a3b17a-af50-4e16-8ee2-011bfe71370b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.031079 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.031137 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl7zq\" (UniqueName: \"kubernetes.io/projected/b3a3b17a-af50-4e16-8ee2-011bfe71370b-kube-api-access-wl7zq\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.031149 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3a3b17a-af50-4e16-8ee2-011bfe71370b-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.392073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-vk44j" event={"ID":"b3a3b17a-af50-4e16-8ee2-011bfe71370b","Type":"ContainerDied","Data":"711fe4d85f8a181a761d3d9ebb8c6fe9d125b58d2a1ecf2a774e32ef0b1c9fd9"} Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.392115 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="711fe4d85f8a181a761d3d9ebb8c6fe9d125b58d2a1ecf2a774e32ef0b1c9fd9" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.392255 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-vk44j" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.481377 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-r9k8t"] Dec 06 09:24:47 crc kubenswrapper[4954]: E1206 09:24:47.481921 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a3b17a-af50-4e16-8ee2-011bfe71370b" containerName="validate-network-openstack-openstack-networker" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.481946 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a3b17a-af50-4e16-8ee2-011bfe71370b" containerName="validate-network-openstack-openstack-networker" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.482135 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a3b17a-af50-4e16-8ee2-011bfe71370b" containerName="validate-network-openstack-openstack-networker" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.483071 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.485070 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.485135 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.493945 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-r9k8t"] Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.644746 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.644805 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm6sn\" (UniqueName: \"kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.644931 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.747116 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.747195 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm6sn\" (UniqueName: \"kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.747313 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.750616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " 
pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.751490 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.767768 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm6sn\" (UniqueName: \"kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn\") pod \"install-os-openstack-openstack-networker-r9k8t\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:47 crc kubenswrapper[4954]: I1206 09:24:47.806654 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:24:48 crc kubenswrapper[4954]: I1206 09:24:48.144682 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-r9k8t"] Dec 06 09:24:48 crc kubenswrapper[4954]: I1206 09:24:48.402275 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-r9k8t" event={"ID":"191da969-b76a-4ca2-95fd-101f479af42e","Type":"ContainerStarted","Data":"b5c39af72d00a58fa4610477d4312242d43b506bb68fe51b1e7264340edbafda"} Dec 06 09:24:49 crc kubenswrapper[4954]: I1206 09:24:49.412553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-r9k8t" event={"ID":"191da969-b76a-4ca2-95fd-101f479af42e","Type":"ContainerStarted","Data":"639816c41211dc0a06d6f2fa1d6e9d5194381cb197da33084964358c7fa8364e"} Dec 06 09:24:49 crc kubenswrapper[4954]: I1206 09:24:49.432469 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-r9k8t" podStartSLOduration=1.691221263 podStartE2EDuration="2.432447893s" podCreationTimestamp="2025-12-06 09:24:47 +0000 UTC" firstStartedPulling="2025-12-06 09:24:48.154221397 +0000 UTC m=+8862.967580786" lastFinishedPulling="2025-12-06 09:24:48.895448027 +0000 UTC m=+8863.708807416" observedRunningTime="2025-12-06 09:24:49.427537852 +0000 UTC m=+8864.240897241" watchObservedRunningTime="2025-12-06 09:24:49.432447893 +0000 UTC m=+8864.245807282" Dec 06 09:24:56 crc kubenswrapper[4954]: I1206 09:24:56.481729 4954 generic.go:334] "Generic (PLEG): container finished" podID="723e1f46-4585-4b35-ab0f-cdd381997052" containerID="d21cdf70fd10ab7dac8e5082eb4b51f750d2e7cd0dbbf730a848fb337440615c" exitCode=0 Dec 06 09:24:56 crc kubenswrapper[4954]: I1206 09:24:56.481791 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" event={"ID":"723e1f46-4585-4b35-ab0f-cdd381997052","Type":"ContainerDied","Data":"d21cdf70fd10ab7dac8e5082eb4b51f750d2e7cd0dbbf730a848fb337440615c"} Dec 06 09:24:57 crc kubenswrapper[4954]: I1206 09:24:57.965396 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.085443 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key\") pod \"723e1f46-4585-4b35-ab0f-cdd381997052\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.085643 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory\") pod \"723e1f46-4585-4b35-ab0f-cdd381997052\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.085769 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qpr\" (UniqueName: \"kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr\") pod \"723e1f46-4585-4b35-ab0f-cdd381997052\" (UID: \"723e1f46-4585-4b35-ab0f-cdd381997052\") " Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.092756 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr" (OuterVolumeSpecName: "kube-api-access-58qpr") pod "723e1f46-4585-4b35-ab0f-cdd381997052" (UID: "723e1f46-4585-4b35-ab0f-cdd381997052"). InnerVolumeSpecName "kube-api-access-58qpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.119629 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "723e1f46-4585-4b35-ab0f-cdd381997052" (UID: "723e1f46-4585-4b35-ab0f-cdd381997052"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.119661 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory" (OuterVolumeSpecName: "inventory") pod "723e1f46-4585-4b35-ab0f-cdd381997052" (UID: "723e1f46-4585-4b35-ab0f-cdd381997052"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.197184 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.197245 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qpr\" (UniqueName: \"kubernetes.io/projected/723e1f46-4585-4b35-ab0f-cdd381997052-kube-api-access-58qpr\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.197263 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/723e1f46-4585-4b35-ab0f-cdd381997052-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.501053 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" event={"ID":"723e1f46-4585-4b35-ab0f-cdd381997052","Type":"ContainerDied","Data":"01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3"} Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.501322 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01564103ba0e57fa14beef61f2180192ec630b412b23ae06c5e9dbfe2f7ee5e3" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.501127 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lxt7m" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.580885 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lnnrb"] Dec 06 09:24:58 crc kubenswrapper[4954]: E1206 09:24:58.581283 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723e1f46-4585-4b35-ab0f-cdd381997052" containerName="configure-network-openstack-openstack-cell1" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.581303 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="723e1f46-4585-4b35-ab0f-cdd381997052" containerName="configure-network-openstack-openstack-cell1" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.581517 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="723e1f46-4585-4b35-ab0f-cdd381997052" containerName="configure-network-openstack-openstack-cell1" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.582203 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.587881 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.590156 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.592038 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lnnrb"] Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.706181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.706246 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjntm\" (UniqueName: \"kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.706481 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.809185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.809232 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjntm\" (UniqueName: \"kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.809288 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.814313 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: 
\"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.821248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.825661 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjntm\" (UniqueName: \"kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm\") pod \"validate-network-openstack-openstack-cell1-lnnrb\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:58 crc kubenswrapper[4954]: I1206 09:24:58.903378 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:24:59 crc kubenswrapper[4954]: I1206 09:24:59.401082 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lnnrb"] Dec 06 09:24:59 crc kubenswrapper[4954]: I1206 09:24:59.510243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" event={"ID":"8772370c-5e61-442a-a0c4-ca20e1e43d07","Type":"ContainerStarted","Data":"d1d233019d979e7d2ccc25b9fc7bf8dc3ce5837b440c6fd32b842a267964d1de"} Dec 06 09:25:00 crc kubenswrapper[4954]: I1206 09:25:00.527935 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" event={"ID":"8772370c-5e61-442a-a0c4-ca20e1e43d07","Type":"ContainerStarted","Data":"acfc1a36bee8cca3fb4356ef80c253b793d356af9342d53725fae8d773c3c617"} Dec 06 09:25:00 crc kubenswrapper[4954]: I1206 09:25:00.547988 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" podStartSLOduration=2.112400453 podStartE2EDuration="2.547964535s" podCreationTimestamp="2025-12-06 09:24:58 +0000 UTC" firstStartedPulling="2025-12-06 09:24:59.403871904 +0000 UTC m=+8874.217231293" lastFinishedPulling="2025-12-06 09:24:59.839435986 +0000 UTC m=+8874.652795375" observedRunningTime="2025-12-06 09:25:00.543343882 +0000 UTC m=+8875.356703291" watchObservedRunningTime="2025-12-06 09:25:00.547964535 +0000 UTC m=+8875.361323934" Dec 06 09:25:07 crc kubenswrapper[4954]: I1206 09:25:07.622823 4954 generic.go:334] "Generic (PLEG): container finished" podID="8772370c-5e61-442a-a0c4-ca20e1e43d07" containerID="acfc1a36bee8cca3fb4356ef80c253b793d356af9342d53725fae8d773c3c617" exitCode=0 Dec 06 09:25:07 crc kubenswrapper[4954]: I1206 09:25:07.622908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" event={"ID":"8772370c-5e61-442a-a0c4-ca20e1e43d07","Type":"ContainerDied","Data":"acfc1a36bee8cca3fb4356ef80c253b793d356af9342d53725fae8d773c3c617"} Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.079138 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.232191 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory\") pod \"8772370c-5e61-442a-a0c4-ca20e1e43d07\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.232314 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjntm\" (UniqueName: \"kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm\") pod \"8772370c-5e61-442a-a0c4-ca20e1e43d07\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.232416 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key\") pod \"8772370c-5e61-442a-a0c4-ca20e1e43d07\" (UID: \"8772370c-5e61-442a-a0c4-ca20e1e43d07\") " Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.239771 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm" (OuterVolumeSpecName: "kube-api-access-gjntm") pod "8772370c-5e61-442a-a0c4-ca20e1e43d07" (UID: "8772370c-5e61-442a-a0c4-ca20e1e43d07"). InnerVolumeSpecName "kube-api-access-gjntm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.260077 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory" (OuterVolumeSpecName: "inventory") pod "8772370c-5e61-442a-a0c4-ca20e1e43d07" (UID: "8772370c-5e61-442a-a0c4-ca20e1e43d07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.261789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8772370c-5e61-442a-a0c4-ca20e1e43d07" (UID: "8772370c-5e61-442a-a0c4-ca20e1e43d07"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.335577 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjntm\" (UniqueName: \"kubernetes.io/projected/8772370c-5e61-442a-a0c4-ca20e1e43d07-kube-api-access-gjntm\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.335608 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.335617 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8772370c-5e61-442a-a0c4-ca20e1e43d07-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.648422 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" event={"ID":"8772370c-5e61-442a-a0c4-ca20e1e43d07","Type":"ContainerDied","Data":"d1d233019d979e7d2ccc25b9fc7bf8dc3ce5837b440c6fd32b842a267964d1de"} Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.648466 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d233019d979e7d2ccc25b9fc7bf8dc3ce5837b440c6fd32b842a267964d1de" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.648582 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lnnrb" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.742422 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pxq8m"] Dec 06 09:25:09 crc kubenswrapper[4954]: E1206 09:25:09.742949 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8772370c-5e61-442a-a0c4-ca20e1e43d07" containerName="validate-network-openstack-openstack-cell1" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.743043 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8772370c-5e61-442a-a0c4-ca20e1e43d07" containerName="validate-network-openstack-openstack-cell1" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.743305 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8772370c-5e61-442a-a0c4-ca20e1e43d07" containerName="validate-network-openstack-openstack-cell1" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.744096 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.748973 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.749081 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.762978 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pxq8m"] Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.847301 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vq2q\" (UniqueName: \"kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.847373 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.847623 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.949942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.950128 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vq2q\" (UniqueName: \"kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.950665 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.954534 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc 
kubenswrapper[4954]: I1206 09:25:09.954781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:09 crc kubenswrapper[4954]: I1206 09:25:09.971175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vq2q\" (UniqueName: \"kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q\") pod \"install-os-openstack-openstack-cell1-pxq8m\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.061512 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.101225 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.101297 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.101357 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.102285 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.102369 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3" gracePeriod=600 Dec 06 09:25:10 crc kubenswrapper[4954]: E1206 09:25:10.359965 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0babbe_21ce_42f4_90cf_c3eb21991413.slice/crio-aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3.scope\": RecentStats: unable to find data in memory cache]" Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.631686 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pxq8m"] Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.659142 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" 
event={"ID":"ceb965f1-31a2-439a-aee5-4619e914a9f0","Type":"ContainerStarted","Data":"2f86fa254af936d96110d6c561277a14fb5d4c05043177516d17b3a95a265079"} Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.662071 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3" exitCode=0 Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.662115 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3"} Dec 06 09:25:10 crc kubenswrapper[4954]: I1206 09:25:10.662172 4954 scope.go:117] "RemoveContainer" containerID="4cdb3382d305e1600b9c7b23bbbc74fb3765ec4f1a1ee1efe33657432284a173" Dec 06 09:25:11 crc kubenswrapper[4954]: I1206 09:25:11.675401 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"} Dec 06 09:25:11 crc kubenswrapper[4954]: I1206 09:25:11.679553 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" event={"ID":"ceb965f1-31a2-439a-aee5-4619e914a9f0","Type":"ContainerStarted","Data":"2897acd450a8427c7f373309374d523d8b85895d359123837f2b61e86bdeb9d4"} Dec 06 09:25:11 crc kubenswrapper[4954]: I1206 09:25:11.742113 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" podStartSLOduration=2.297116949 podStartE2EDuration="2.742065371s" podCreationTimestamp="2025-12-06 09:25:09 +0000 UTC" firstStartedPulling="2025-12-06 09:25:10.641911252 +0000 UTC m=+8885.455270641" lastFinishedPulling="2025-12-06 09:25:11.086859684 +0000 UTC m=+8885.900219063" observedRunningTime="2025-12-06 09:25:11.731768567 +0000 UTC m=+8886.545127966" watchObservedRunningTime="2025-12-06 09:25:11.742065371 +0000 UTC m=+8886.555424770" Dec 06 09:25:39 crc kubenswrapper[4954]: I1206 09:25:39.953424 4954 generic.go:334] "Generic (PLEG): container finished" podID="191da969-b76a-4ca2-95fd-101f479af42e" containerID="639816c41211dc0a06d6f2fa1d6e9d5194381cb197da33084964358c7fa8364e" exitCode=0 Dec 06 09:25:39 crc kubenswrapper[4954]: I1206 09:25:39.953996 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-r9k8t" event={"ID":"191da969-b76a-4ca2-95fd-101f479af42e","Type":"ContainerDied","Data":"639816c41211dc0a06d6f2fa1d6e9d5194381cb197da33084964358c7fa8364e"} Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.407311 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.502773 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm6sn\" (UniqueName: \"kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn\") pod \"191da969-b76a-4ca2-95fd-101f479af42e\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.502827 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") pod \"191da969-b76a-4ca2-95fd-101f479af42e\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.502876 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory\") pod \"191da969-b76a-4ca2-95fd-101f479af42e\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.508093 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn" (OuterVolumeSpecName: "kube-api-access-qm6sn") pod "191da969-b76a-4ca2-95fd-101f479af42e" (UID: "191da969-b76a-4ca2-95fd-101f479af42e"). InnerVolumeSpecName "kube-api-access-qm6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4954]: E1206 09:25:41.535982 4954 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key podName:191da969-b76a-4ca2-95fd-101f479af42e nodeName:}" failed. No retries permitted until 2025-12-06 09:25:42.035945112 +0000 UTC m=+8916.849304501 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key") pod "191da969-b76a-4ca2-95fd-101f479af42e" (UID: "191da969-b76a-4ca2-95fd-101f479af42e") : error deleting /var/lib/kubelet/pods/191da969-b76a-4ca2-95fd-101f479af42e/volume-subpaths: remove /var/lib/kubelet/pods/191da969-b76a-4ca2-95fd-101f479af42e/volume-subpaths: no such file or directory Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.539324 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory" (OuterVolumeSpecName: "inventory") pod "191da969-b76a-4ca2-95fd-101f479af42e" (UID: "191da969-b76a-4ca2-95fd-101f479af42e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.605636 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm6sn\" (UniqueName: \"kubernetes.io/projected/191da969-b76a-4ca2-95fd-101f479af42e-kube-api-access-qm6sn\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.605671 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.975069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-r9k8t" event={"ID":"191da969-b76a-4ca2-95fd-101f479af42e","Type":"ContainerDied","Data":"b5c39af72d00a58fa4610477d4312242d43b506bb68fe51b1e7264340edbafda"} Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.975109 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5c39af72d00a58fa4610477d4312242d43b506bb68fe51b1e7264340edbafda" Dec 06 09:25:41 crc kubenswrapper[4954]: I1206 09:25:41.975156 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-r9k8t" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.061388 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-tkq4q"] Dec 06 09:25:42 crc kubenswrapper[4954]: E1206 09:25:42.061857 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191da969-b76a-4ca2-95fd-101f479af42e" containerName="install-os-openstack-openstack-networker" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.061891 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="191da969-b76a-4ca2-95fd-101f479af42e" containerName="install-os-openstack-openstack-networker" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.062109 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="191da969-b76a-4ca2-95fd-101f479af42e" containerName="install-os-openstack-openstack-networker" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.062878 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.069895 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-tkq4q"] Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.112956 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") pod \"191da969-b76a-4ca2-95fd-101f479af42e\" (UID: \"191da969-b76a-4ca2-95fd-101f479af42e\") " Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.118384 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "191da969-b76a-4ca2-95fd-101f479af42e" (UID: "191da969-b76a-4ca2-95fd-101f479af42e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.216241 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.216404 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbcl\" (UniqueName: \"kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.216474 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.216761 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/191da969-b76a-4ca2-95fd-101f479af42e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.318315 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbcl\" (UniqueName: \"kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.318466 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.318651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.323044 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.325025 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory\") pod 
\"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.342511 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbcl\" (UniqueName: \"kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl\") pod \"configure-os-openstack-openstack-networker-tkq4q\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.378422 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.896373 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-tkq4q"] Dec 06 09:25:42 crc kubenswrapper[4954]: I1206 09:25:42.998129 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" event={"ID":"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1","Type":"ContainerStarted","Data":"81d4ce9d6a9f9cebce36efdf5b377ffcf58d2850be08b18f159a326f32034ce3"} Dec 06 09:25:44 crc kubenswrapper[4954]: I1206 09:25:44.007706 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" event={"ID":"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1","Type":"ContainerStarted","Data":"9ec19dc330f75d2a8e5c543b81364e57611aa253f3c98d53ead7a076a5dbf6be"} Dec 06 09:25:44 crc kubenswrapper[4954]: I1206 09:25:44.027004 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" podStartSLOduration=1.480096441 podStartE2EDuration="2.0269758s" podCreationTimestamp="2025-12-06 09:25:42 +0000 UTC" firstStartedPulling="2025-12-06 09:25:42.915865219 +0000 UTC m=+8917.729224608" lastFinishedPulling="2025-12-06 09:25:43.462744578 +0000 UTC m=+8918.276103967" observedRunningTime="2025-12-06 09:25:44.020617681 +0000 UTC m=+8918.833977090" watchObservedRunningTime="2025-12-06 09:25:44.0269758 +0000 UTC m=+8918.840335189" Dec 06 09:26:04 crc kubenswrapper[4954]: I1206 09:26:04.234257 4954 generic.go:334] "Generic (PLEG): container finished" podID="ceb965f1-31a2-439a-aee5-4619e914a9f0" containerID="2897acd450a8427c7f373309374d523d8b85895d359123837f2b61e86bdeb9d4" exitCode=0 Dec 06 09:26:04 crc kubenswrapper[4954]: I1206 09:26:04.234386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" event={"ID":"ceb965f1-31a2-439a-aee5-4619e914a9f0","Type":"ContainerDied","Data":"2897acd450a8427c7f373309374d523d8b85895d359123837f2b61e86bdeb9d4"} Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.750935 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.913484 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vq2q\" (UniqueName: \"kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q\") pod \"ceb965f1-31a2-439a-aee5-4619e914a9f0\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.913698 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key\") pod \"ceb965f1-31a2-439a-aee5-4619e914a9f0\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.913808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory\") pod \"ceb965f1-31a2-439a-aee5-4619e914a9f0\" (UID: \"ceb965f1-31a2-439a-aee5-4619e914a9f0\") " Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.920947 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q" (OuterVolumeSpecName: "kube-api-access-9vq2q") pod "ceb965f1-31a2-439a-aee5-4619e914a9f0" (UID: "ceb965f1-31a2-439a-aee5-4619e914a9f0"). InnerVolumeSpecName "kube-api-access-9vq2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.944205 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory" (OuterVolumeSpecName: "inventory") pod "ceb965f1-31a2-439a-aee5-4619e914a9f0" (UID: "ceb965f1-31a2-439a-aee5-4619e914a9f0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:05 crc kubenswrapper[4954]: I1206 09:26:05.950774 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ceb965f1-31a2-439a-aee5-4619e914a9f0" (UID: "ceb965f1-31a2-439a-aee5-4619e914a9f0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.016937 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.016972 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vq2q\" (UniqueName: \"kubernetes.io/projected/ceb965f1-31a2-439a-aee5-4619e914a9f0-kube-api-access-9vq2q\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.016983 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb965f1-31a2-439a-aee5-4619e914a9f0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.257774 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" event={"ID":"ceb965f1-31a2-439a-aee5-4619e914a9f0","Type":"ContainerDied","Data":"2f86fa254af936d96110d6c561277a14fb5d4c05043177516d17b3a95a265079"} Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.257820 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f86fa254af936d96110d6c561277a14fb5d4c05043177516d17b3a95a265079" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.257884 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pxq8m" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.363087 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67w67"] Dec 06 09:26:06 crc kubenswrapper[4954]: E1206 09:26:06.363827 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb965f1-31a2-439a-aee5-4619e914a9f0" containerName="install-os-openstack-openstack-cell1" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.363855 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb965f1-31a2-439a-aee5-4619e914a9f0" containerName="install-os-openstack-openstack-cell1" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.364097 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb965f1-31a2-439a-aee5-4619e914a9f0" containerName="install-os-openstack-openstack-cell1" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.365271 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.367847 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.368046 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.384461 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67w67"] Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.527477 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txtd6\" (UniqueName: \"kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.528166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.528275 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.632798 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txtd6\" (UniqueName: \"kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.632950 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.633007 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.638313 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " 
pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.646205 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.650881 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txtd6\" (UniqueName: \"kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6\") pod \"configure-os-openstack-openstack-cell1-67w67\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:06 crc kubenswrapper[4954]: I1206 09:26:06.702363 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:26:07 crc kubenswrapper[4954]: I1206 09:26:07.917007 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-67w67"] Dec 06 09:26:08 crc kubenswrapper[4954]: I1206 09:26:08.277002 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67w67" event={"ID":"bc171c73-4a40-48cc-9a75-9dd222a9da01","Type":"ContainerStarted","Data":"a9a5c7a252e228edca831a08e0482ccb6a3111947f63de57d3e90cfa100b74e6"} Dec 06 09:26:09 crc kubenswrapper[4954]: I1206 09:26:09.286713 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67w67" event={"ID":"bc171c73-4a40-48cc-9a75-9dd222a9da01","Type":"ContainerStarted","Data":"3f5439cda76fd9c1ea4a76550cbfc99fa96e5b8447d11eb343bb26bf95687509"} Dec 06 09:26:09 crc kubenswrapper[4954]: I1206 09:26:09.304382 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-67w67" podStartSLOduration=2.79855314 podStartE2EDuration="3.304360535s" podCreationTimestamp="2025-12-06 09:26:06 +0000 UTC" firstStartedPulling="2025-12-06 09:26:07.925555347 +0000 UTC m=+8942.738914736" lastFinishedPulling="2025-12-06 09:26:08.431362742 +0000 UTC m=+8943.244722131" observedRunningTime="2025-12-06 09:26:09.299552827 +0000 UTC m=+8944.112912236" watchObservedRunningTime="2025-12-06 09:26:09.304360535 +0000 UTC m=+8944.117719924" Dec 06 09:26:36 crc kubenswrapper[4954]: I1206 09:26:36.553763 4954 generic.go:334] "Generic (PLEG): container finished" podID="e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" containerID="9ec19dc330f75d2a8e5c543b81364e57611aa253f3c98d53ead7a076a5dbf6be" exitCode=0 Dec 06 09:26:36 crc kubenswrapper[4954]: I1206 09:26:36.553848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" event={"ID":"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1","Type":"ContainerDied","Data":"9ec19dc330f75d2a8e5c543b81364e57611aa253f3c98d53ead7a076a5dbf6be"} Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.173094 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.287601 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory\") pod \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.287709 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjbcl\" (UniqueName: \"kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl\") pod \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.287800 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key\") pod \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\" (UID: \"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1\") " Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.293057 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl" (OuterVolumeSpecName: "kube-api-access-hjbcl") pod "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" (UID: "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1"). InnerVolumeSpecName "kube-api-access-hjbcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.320412 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" (UID: "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.330322 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory" (OuterVolumeSpecName: "inventory") pod "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" (UID: "e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.390396 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.390432 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.390444 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjbcl\" (UniqueName: \"kubernetes.io/projected/e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1-kube-api-access-hjbcl\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.588925 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" event={"ID":"e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1","Type":"ContainerDied","Data":"81d4ce9d6a9f9cebce36efdf5b377ffcf58d2850be08b18f159a326f32034ce3"} Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.589191 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81d4ce9d6a9f9cebce36efdf5b377ffcf58d2850be08b18f159a326f32034ce3" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.589232 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-tkq4q" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.672636 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-z82wc"] Dec 06 09:26:38 crc kubenswrapper[4954]: E1206 09:26:38.673524 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" containerName="configure-os-openstack-openstack-networker" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.673551 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" containerName="configure-os-openstack-openstack-networker" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.673837 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1" containerName="configure-os-openstack-openstack-networker" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.674763 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.677080 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.677348 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.683116 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-z82wc"] Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.798632 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.798916 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2fw\" (UniqueName: \"kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.799094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.901286 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.901416 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2fw\" (UniqueName: \"kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.901811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.910248 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 
09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.910253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:38 crc kubenswrapper[4954]: I1206 09:26:38.928224 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2fw\" (UniqueName: \"kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw\") pod \"run-os-openstack-openstack-networker-z82wc\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:39 crc kubenswrapper[4954]: I1206 09:26:39.005615 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:39 crc kubenswrapper[4954]: I1206 09:26:39.513978 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-z82wc"] Dec 06 09:26:40 crc kubenswrapper[4954]: I1206 09:26:40.609043 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-z82wc" event={"ID":"00ae1c89-e237-492e-b815-d385026df7e6","Type":"ContainerStarted","Data":"7d704437f6f5f8a3220fb7ed49df86502c5d27b3cedebbad283ac45a47847d96"} Dec 06 09:26:40 crc kubenswrapper[4954]: I1206 09:26:40.610946 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-z82wc" event={"ID":"00ae1c89-e237-492e-b815-d385026df7e6","Type":"ContainerStarted","Data":"93858e7037193b3450311d13daeb9d8a8547ea9fe1123ad35e66085ba9802944"} Dec 06 09:26:40 crc kubenswrapper[4954]: I1206 09:26:40.636467 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-z82wc" podStartSLOduration=2.201939222 podStartE2EDuration="2.636445675s" podCreationTimestamp="2025-12-06 09:26:38 +0000 UTC" firstStartedPulling="2025-12-06 09:26:39.677296295 +0000 UTC m=+8974.490655684" lastFinishedPulling="2025-12-06 09:26:40.111802748 +0000 UTC m=+8974.925162137" observedRunningTime="2025-12-06 09:26:40.628671158 +0000 UTC m=+8975.442030567" watchObservedRunningTime="2025-12-06 09:26:40.636445675 +0000 UTC m=+8975.449805064" Dec 06 09:26:49 crc kubenswrapper[4954]: I1206 09:26:49.931643 4954 generic.go:334] "Generic (PLEG): container finished" podID="00ae1c89-e237-492e-b815-d385026df7e6" containerID="7d704437f6f5f8a3220fb7ed49df86502c5d27b3cedebbad283ac45a47847d96" exitCode=0 Dec 06 09:26:49 crc kubenswrapper[4954]: I1206 09:26:49.932359 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-z82wc" event={"ID":"00ae1c89-e237-492e-b815-d385026df7e6","Type":"ContainerDied","Data":"7d704437f6f5f8a3220fb7ed49df86502c5d27b3cedebbad283ac45a47847d96"} Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.435548 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.497464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp2fw\" (UniqueName: \"kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw\") pod \"00ae1c89-e237-492e-b815-d385026df7e6\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.497578 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory\") pod \"00ae1c89-e237-492e-b815-d385026df7e6\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.497610 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key\") pod \"00ae1c89-e237-492e-b815-d385026df7e6\" (UID: \"00ae1c89-e237-492e-b815-d385026df7e6\") " Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.515955 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw" (OuterVolumeSpecName: "kube-api-access-lp2fw") pod "00ae1c89-e237-492e-b815-d385026df7e6" (UID: "00ae1c89-e237-492e-b815-d385026df7e6"). InnerVolumeSpecName "kube-api-access-lp2fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.540265 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "00ae1c89-e237-492e-b815-d385026df7e6" (UID: "00ae1c89-e237-492e-b815-d385026df7e6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.546523 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory" (OuterVolumeSpecName: "inventory") pod "00ae1c89-e237-492e-b815-d385026df7e6" (UID: "00ae1c89-e237-492e-b815-d385026df7e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.610123 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp2fw\" (UniqueName: \"kubernetes.io/projected/00ae1c89-e237-492e-b815-d385026df7e6-kube-api-access-lp2fw\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.610164 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.610175 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ae1c89-e237-492e-b815-d385026df7e6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.958638 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-z82wc" Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.958548 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-z82wc" event={"ID":"00ae1c89-e237-492e-b815-d385026df7e6","Type":"ContainerDied","Data":"93858e7037193b3450311d13daeb9d8a8547ea9fe1123ad35e66085ba9802944"} Dec 06 09:26:51 crc kubenswrapper[4954]: I1206 09:26:51.958813 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93858e7037193b3450311d13daeb9d8a8547ea9fe1123ad35e66085ba9802944" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.041887 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qwhjm"] Dec 06 09:26:52 crc kubenswrapper[4954]: E1206 09:26:52.042672 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae1c89-e237-492e-b815-d385026df7e6" containerName="run-os-openstack-openstack-networker" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.042714 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ae1c89-e237-492e-b815-d385026df7e6" containerName="run-os-openstack-openstack-networker" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.042958 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ae1c89-e237-492e-b815-d385026df7e6" containerName="run-os-openstack-openstack-networker" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.043878 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.046328 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.046654 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.058701 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qwhjm"] Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.119952 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.120031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj5cq\" (UniqueName: \"kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.120308 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc 
kubenswrapper[4954]: I1206 09:26:52.221849 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.221928 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.221976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj5cq\" (UniqueName: \"kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.228769 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.235780 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.244846 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj5cq\" (UniqueName: \"kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq\") pod \"reboot-os-openstack-openstack-networker-qwhjm\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.368889 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:26:52 crc kubenswrapper[4954]: I1206 09:26:52.955218 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qwhjm"] Dec 06 09:26:53 crc kubenswrapper[4954]: I1206 09:26:53.993472 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" event={"ID":"cfb803e6-8e44-4318-a660-4b90c4b19150","Type":"ContainerStarted","Data":"83c6269b6a6f1980a6636ed1d469f271fdbccfd174fe30f213ea93da61caef91"} Dec 06 09:26:53 crc kubenswrapper[4954]: I1206 09:26:53.993987 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" event={"ID":"cfb803e6-8e44-4318-a660-4b90c4b19150","Type":"ContainerStarted","Data":"48daf8e396a1597e67274a0710780db0c89422e99c2b572a3ada0d6bf5244375"} Dec 06 09:26:54 crc kubenswrapper[4954]: I1206 09:26:54.012589 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" podStartSLOduration=1.510596219 podStartE2EDuration="2.012555181s" podCreationTimestamp="2025-12-06 09:26:52 +0000 UTC" firstStartedPulling="2025-12-06 09:26:52.963072144 +0000 UTC m=+8987.776431533" lastFinishedPulling="2025-12-06 09:26:53.465031106 +0000 UTC m=+8988.278390495" observedRunningTime="2025-12-06 09:26:54.011330859 +0000 UTC m=+8988.824690268" watchObservedRunningTime="2025-12-06 09:26:54.012555181 +0000 UTC m=+8988.825914570" Dec 06 09:27:03 crc kubenswrapper[4954]: I1206 09:27:03.086597 4954 generic.go:334] "Generic (PLEG): container finished" podID="bc171c73-4a40-48cc-9a75-9dd222a9da01" containerID="3f5439cda76fd9c1ea4a76550cbfc99fa96e5b8447d11eb343bb26bf95687509" exitCode=0 Dec 06 09:27:03 crc kubenswrapper[4954]: I1206 09:27:03.086608 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67w67" event={"ID":"bc171c73-4a40-48cc-9a75-9dd222a9da01","Type":"ContainerDied","Data":"3f5439cda76fd9c1ea4a76550cbfc99fa96e5b8447d11eb343bb26bf95687509"} Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.532519 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.694308 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key\") pod \"bc171c73-4a40-48cc-9a75-9dd222a9da01\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.694401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory\") pod \"bc171c73-4a40-48cc-9a75-9dd222a9da01\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.694669 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txtd6\" (UniqueName: \"kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6\") pod \"bc171c73-4a40-48cc-9a75-9dd222a9da01\" (UID: \"bc171c73-4a40-48cc-9a75-9dd222a9da01\") " Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.700922 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6" (OuterVolumeSpecName: "kube-api-access-txtd6") pod "bc171c73-4a40-48cc-9a75-9dd222a9da01" (UID: "bc171c73-4a40-48cc-9a75-9dd222a9da01"). InnerVolumeSpecName "kube-api-access-txtd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.728917 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc171c73-4a40-48cc-9a75-9dd222a9da01" (UID: "bc171c73-4a40-48cc-9a75-9dd222a9da01"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.729443 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory" (OuterVolumeSpecName: "inventory") pod "bc171c73-4a40-48cc-9a75-9dd222a9da01" (UID: "bc171c73-4a40-48cc-9a75-9dd222a9da01"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.803820 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txtd6\" (UniqueName: \"kubernetes.io/projected/bc171c73-4a40-48cc-9a75-9dd222a9da01-kube-api-access-txtd6\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.803870 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:04 crc kubenswrapper[4954]: I1206 09:27:04.803881 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc171c73-4a40-48cc-9a75-9dd222a9da01-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.121130 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-67w67" event={"ID":"bc171c73-4a40-48cc-9a75-9dd222a9da01","Type":"ContainerDied","Data":"a9a5c7a252e228edca831a08e0482ccb6a3111947f63de57d3e90cfa100b74e6"} Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.121172 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a5c7a252e228edca831a08e0482ccb6a3111947f63de57d3e90cfa100b74e6" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.121258 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-67w67" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.203240 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-jgtdv"] Dec 06 09:27:05 crc kubenswrapper[4954]: E1206 09:27:05.203914 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc171c73-4a40-48cc-9a75-9dd222a9da01" containerName="configure-os-openstack-openstack-cell1" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.203939 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc171c73-4a40-48cc-9a75-9dd222a9da01" containerName="configure-os-openstack-openstack-cell1" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.204267 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc171c73-4a40-48cc-9a75-9dd222a9da01" containerName="configure-os-openstack-openstack-cell1" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.205336 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.207533 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.208959 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.221869 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jgtdv"] Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.427504 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnf4\" (UniqueName: \"kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.427881 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.427976 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.428059 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.428181 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.531002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.531131 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 
09:27:05.531265 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.531404 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnf4\" (UniqueName: \"kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.531748 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.533743 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.536103 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.536737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.553596 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.555545 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.556463 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnf4\" (UniqueName: \"kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4\") pod \"ssh-known-hosts-openstack-jgtdv\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.832053 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:27:05 crc kubenswrapper[4954]: I1206 09:27:05.840515 
4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:06 crc kubenswrapper[4954]: I1206 09:27:06.403894 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jgtdv"] Dec 06 09:27:06 crc kubenswrapper[4954]: W1206 09:27:06.408724 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e80411b_5c3f_4ea5_9963_6fb8b639ae94.slice/crio-6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550 WatchSource:0}: Error finding container 6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550: Status 404 returned error can't find the container with id 6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550 Dec 06 09:27:07 crc kubenswrapper[4954]: I1206 09:27:07.143034 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jgtdv" event={"ID":"1e80411b-5c3f-4ea5-9963-6fb8b639ae94","Type":"ContainerStarted","Data":"e65fdd9a14d1604ad08c1c37f639f4849608764b212b0015ccabf34db0a2a24d"} Dec 06 09:27:07 crc kubenswrapper[4954]: I1206 09:27:07.143338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jgtdv" event={"ID":"1e80411b-5c3f-4ea5-9963-6fb8b639ae94","Type":"ContainerStarted","Data":"6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550"} Dec 06 09:27:07 crc kubenswrapper[4954]: I1206 09:27:07.165982 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-jgtdv" podStartSLOduration=1.751365078 podStartE2EDuration="2.165953791s" podCreationTimestamp="2025-12-06 09:27:05 +0000 UTC" firstStartedPulling="2025-12-06 09:27:06.411760505 +0000 UTC m=+9001.225119894" lastFinishedPulling="2025-12-06 09:27:06.826349218 +0000 UTC m=+9001.639708607" observedRunningTime="2025-12-06 09:27:07.161791161 +0000 UTC m=+9001.975150550" watchObservedRunningTime="2025-12-06 09:27:07.165953791 +0000 UTC m=+9001.979313180" Dec 06 09:27:08 crc kubenswrapper[4954]: I1206 09:27:08.152142 4954 generic.go:334] "Generic (PLEG): container finished" podID="cfb803e6-8e44-4318-a660-4b90c4b19150" containerID="83c6269b6a6f1980a6636ed1d469f271fdbccfd174fe30f213ea93da61caef91" exitCode=0 Dec 06 09:27:08 crc kubenswrapper[4954]: I1206 09:27:08.152214 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" event={"ID":"cfb803e6-8e44-4318-a660-4b90c4b19150","Type":"ContainerDied","Data":"83c6269b6a6f1980a6636ed1d469f271fdbccfd174fe30f213ea93da61caef91"} Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.595498 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.722736 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory\") pod \"cfb803e6-8e44-4318-a660-4b90c4b19150\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.723114 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj5cq\" (UniqueName: \"kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq\") pod \"cfb803e6-8e44-4318-a660-4b90c4b19150\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.723239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key\") pod \"cfb803e6-8e44-4318-a660-4b90c4b19150\" (UID: \"cfb803e6-8e44-4318-a660-4b90c4b19150\") " Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.728484 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq" (OuterVolumeSpecName: "kube-api-access-hj5cq") pod "cfb803e6-8e44-4318-a660-4b90c4b19150" (UID: "cfb803e6-8e44-4318-a660-4b90c4b19150"). InnerVolumeSpecName "kube-api-access-hj5cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.760417 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory" (OuterVolumeSpecName: "inventory") pod "cfb803e6-8e44-4318-a660-4b90c4b19150" (UID: "cfb803e6-8e44-4318-a660-4b90c4b19150"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.761614 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cfb803e6-8e44-4318-a660-4b90c4b19150" (UID: "cfb803e6-8e44-4318-a660-4b90c4b19150"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.825454 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.825492 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj5cq\" (UniqueName: \"kubernetes.io/projected/cfb803e6-8e44-4318-a660-4b90c4b19150-kube-api-access-hj5cq\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:09 crc kubenswrapper[4954]: I1206 09:27:09.825502 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfb803e6-8e44-4318-a660-4b90c4b19150-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.173445 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" event={"ID":"cfb803e6-8e44-4318-a660-4b90c4b19150","Type":"ContainerDied","Data":"48daf8e396a1597e67274a0710780db0c89422e99c2b572a3ada0d6bf5244375"} Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.173485 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48daf8e396a1597e67274a0710780db0c89422e99c2b572a3ada0d6bf5244375" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.173583 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qwhjm" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.322466 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-bvkwz"] Dec 06 09:27:10 crc kubenswrapper[4954]: E1206 09:27:10.322905 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb803e6-8e44-4318-a660-4b90c4b19150" containerName="reboot-os-openstack-openstack-networker" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.322924 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb803e6-8e44-4318-a660-4b90c4b19150" containerName="reboot-os-openstack-openstack-networker" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.323111 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb803e6-8e44-4318-a660-4b90c4b19150" containerName="reboot-os-openstack-openstack-networker" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.323814 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.326887 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.327113 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-neutron-metadata-default-certs-0" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.327596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-ovn-default-certs-0" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335028 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335306 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78b9\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335400 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335472 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335537 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335664 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.335758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.337042 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-bvkwz"] Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438331 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438485 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438514 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b78b9\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438551 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438679 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438704 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438734 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.438800 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.443060 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.443462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.443636 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.444191 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.445394 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.452372 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.453182 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.456422 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78b9\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9\") pod \"install-certs-openstack-openstack-networker-bvkwz\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:10 crc kubenswrapper[4954]: I1206 09:27:10.682806 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:11 crc kubenswrapper[4954]: I1206 09:27:11.214896 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-bvkwz"] Dec 06 09:27:11 crc kubenswrapper[4954]: W1206 09:27:11.218724 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a15c7ab_e735_41c0_ba7b_d2cb7eb8912b.slice/crio-6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898 WatchSource:0}: Error finding container 6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898: Status 404 returned error can't find the container with id 6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898 Dec 06 09:27:12 crc kubenswrapper[4954]: I1206 09:27:12.193655 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" event={"ID":"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b","Type":"ContainerStarted","Data":"a52bc17b6715dfebae1af1655ac3e279793c6482faeedca120f2885f7a0387fc"} Dec 06 09:27:12 crc kubenswrapper[4954]: I1206 09:27:12.193961 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" event={"ID":"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b","Type":"ContainerStarted","Data":"6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898"} Dec 06 09:27:12 crc kubenswrapper[4954]: I1206 09:27:12.213346 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" podStartSLOduration=1.7773617769999999 podStartE2EDuration="2.213325109s" podCreationTimestamp="2025-12-06 09:27:10 +0000 UTC" firstStartedPulling="2025-12-06 09:27:11.221355805 +0000 UTC m=+9006.034715194" lastFinishedPulling="2025-12-06 09:27:11.657319137 +0000 UTC m=+9006.470678526" observedRunningTime="2025-12-06 09:27:12.210386711 +0000 UTC m=+9007.023746110" watchObservedRunningTime="2025-12-06 09:27:12.213325109 +0000 UTC m=+9007.026684498" Dec 06 09:27:23 crc kubenswrapper[4954]: I1206 
09:27:23.308462 4954 generic.go:334] "Generic (PLEG): container finished" podID="1e80411b-5c3f-4ea5-9963-6fb8b639ae94" containerID="e65fdd9a14d1604ad08c1c37f639f4849608764b212b0015ccabf34db0a2a24d" exitCode=0 Dec 06 09:27:23 crc kubenswrapper[4954]: I1206 09:27:23.308532 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jgtdv" event={"ID":"1e80411b-5c3f-4ea5-9963-6fb8b639ae94","Type":"ContainerDied","Data":"e65fdd9a14d1604ad08c1c37f639f4849608764b212b0015ccabf34db0a2a24d"} Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.783416 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.874164 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1\") pod \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.874294 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpnf4\" (UniqueName: \"kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4\") pod \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.874391 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1\") pod \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.874450 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0\") pod \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.874490 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker\") pod \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\" (UID: \"1e80411b-5c3f-4ea5-9963-6fb8b639ae94\") " Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.880285 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4" (OuterVolumeSpecName: "kube-api-access-vpnf4") pod "1e80411b-5c3f-4ea5-9963-6fb8b639ae94" (UID: "1e80411b-5c3f-4ea5-9963-6fb8b639ae94"). InnerVolumeSpecName "kube-api-access-vpnf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.905747 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1e80411b-5c3f-4ea5-9963-6fb8b639ae94" (UID: "1e80411b-5c3f-4ea5-9963-6fb8b639ae94"). InnerVolumeSpecName "inventory-0". 
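
The SyncLoop ADD/UPDATE entries with source="api" are the kubelet's side of an API-server pod watch, and the reflector "Caches populated" lines are the same machinery priming secret caches for each new job pod. The equivalent event stream can be observed from any client; a hedged client-go sketch (the kubeconfig path is a placeholder, not taken from this log):

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        w, err := cs.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        defer w.Stop()
        for ev := range w.ResultChan() {
            if pod, ok := ev.Object.(*corev1.Pod); ok {
                // ADDED/MODIFIED here correspond to SyncLoop ADD/UPDATE above
                fmt.Println(ev.Type, pod.Namespace+"/"+pod.Name, pod.Status.Phase)
            }
        }
    }

The cadvisor warnings above ("Failed to process watch event ... Status 404") are a separate, benign race: the watcher inspects a crio-... cgroup that a short-lived container has already left.
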
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.906408 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "1e80411b-5c3f-4ea5-9963-6fb8b639ae94" (UID: "1e80411b-5c3f-4ea5-9963-6fb8b639ae94"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.913512 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "1e80411b-5c3f-4ea5-9963-6fb8b639ae94" (UID: "1e80411b-5c3f-4ea5-9963-6fb8b639ae94"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.915995 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "1e80411b-5c3f-4ea5-9963-6fb8b639ae94" (UID: "1e80411b-5c3f-4ea5-9963-6fb8b639ae94"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.977199 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.977417 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.977500 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpnf4\" (UniqueName: \"kubernetes.io/projected/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-kube-api-access-vpnf4\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.977618 4954 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-1\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:24 crc kubenswrapper[4954]: I1206 09:27:24.977693 4954 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1e80411b-5c3f-4ea5-9963-6fb8b639ae94-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.329846 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jgtdv" event={"ID":"1e80411b-5c3f-4ea5-9963-6fb8b639ae94","Type":"ContainerDied","Data":"6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550"} Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.329895 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6192bfc315e9687b95fc4d256da9de49e870878723a36b90dfbe3df2c6ad2550" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.329944 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jgtdv" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.414340 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-skz67"] Dec 06 09:27:25 crc kubenswrapper[4954]: E1206 09:27:25.415141 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e80411b-5c3f-4ea5-9963-6fb8b639ae94" containerName="ssh-known-hosts-openstack" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.415162 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e80411b-5c3f-4ea5-9963-6fb8b639ae94" containerName="ssh-known-hosts-openstack" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.415638 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e80411b-5c3f-4ea5-9963-6fb8b639ae94" containerName="ssh-known-hosts-openstack" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.416577 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.420163 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.420418 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.426624 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-skz67"] Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.487928 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthzm\" (UniqueName: \"kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.488176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.488388 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.589763 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.589863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: 
\"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.589965 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nthzm\" (UniqueName: \"kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.595168 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.597551 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.606539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthzm\" (UniqueName: \"kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm\") pod \"run-os-openstack-openstack-cell1-skz67\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:25 crc kubenswrapper[4954]: I1206 09:27:25.746725 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:26 crc kubenswrapper[4954]: I1206 09:27:26.287837 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-skz67"] Dec 06 09:27:26 crc kubenswrapper[4954]: I1206 09:27:26.300774 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 09:27:26 crc kubenswrapper[4954]: I1206 09:27:26.340283 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-skz67" event={"ID":"10123b35-1d1c-41fa-b752-896706a99792","Type":"ContainerStarted","Data":"495bba9d875b97bd51a6618f8a1f199a6b173e1423f639337d33741ca90f9b6c"} Dec 06 09:27:27 crc kubenswrapper[4954]: I1206 09:27:27.351901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-skz67" event={"ID":"10123b35-1d1c-41fa-b752-896706a99792","Type":"ContainerStarted","Data":"80314ac65b13f4e236b99eb2a5f14a3203644454a444863de9554dff6bfd9825"} Dec 06 09:27:27 crc kubenswrapper[4954]: I1206 09:27:27.370424 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-skz67" podStartSLOduration=1.738643314 podStartE2EDuration="2.370407446s" podCreationTimestamp="2025-12-06 09:27:25 +0000 UTC" firstStartedPulling="2025-12-06 09:27:26.300360759 +0000 UTC m=+9021.113720148" lastFinishedPulling="2025-12-06 09:27:26.932124891 +0000 UTC m=+9021.745484280" observedRunningTime="2025-12-06 09:27:27.36792895 +0000 UTC m=+9022.181288339" watchObservedRunningTime="2025-12-06 09:27:27.370407446 +0000 UTC m=+9022.183766835" Dec 06 09:27:34 crc kubenswrapper[4954]: I1206 09:27:34.425092 4954 generic.go:334] "Generic (PLEG): container finished" podID="0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" containerID="a52bc17b6715dfebae1af1655ac3e279793c6482faeedca120f2885f7a0387fc" exitCode=0 Dec 06 09:27:34 crc kubenswrapper[4954]: I1206 09:27:34.425178 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" event={"ID":"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b","Type":"ContainerDied","Data":"a52bc17b6715dfebae1af1655ac3e279793c6482faeedca120f2885f7a0387fc"} Dec 06 09:27:35 crc kubenswrapper[4954]: I1206 09:27:35.923171 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023158 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023314 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b78b9\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023422 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023497 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023535 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023574 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.023615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle\") pod \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\" (UID: \"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b\") " Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.029130 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.029404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9" (OuterVolumeSpecName: "kube-api-access-b78b9") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "kube-api-access-b78b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.029949 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-networker-neutron-metadata-default-certs-0") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "openstack-networker-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.030719 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-networker-ovn-default-certs-0") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "openstack-networker-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.031423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.032403 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.058259 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.063985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory" (OuterVolumeSpecName: "inventory") pod "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" (UID: "0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125688 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b78b9\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-kube-api-access-b78b9\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125726 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125741 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125750 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125760 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125771 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125783 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-openstack-networker-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.125793 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.464886 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" event={"ID":"0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b","Type":"ContainerDied","Data":"6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898"} Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.465274 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf27ac0150a5baf511ae0c66e332e46f2debd6385f529f316073dd255cb4898" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.465209 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-bvkwz" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.567955 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-ftv6g"] Dec 06 09:27:36 crc kubenswrapper[4954]: E1206 09:27:36.568575 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" containerName="install-certs-openstack-openstack-networker" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.568600 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" containerName="install-certs-openstack-openstack-networker" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.568869 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b" containerName="install-certs-openstack-openstack-networker" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.569789 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.575157 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-qwx8g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.575533 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.575761 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.598692 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-ftv6g"] Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.636284 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.636543 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.636749 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.636814 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jp2w\" (UniqueName: \"kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " 
pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.636883 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.738402 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.738721 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jp2w\" (UniqueName: \"kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.738816 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.738922 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.739018 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.740196 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.742886 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.743300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.743386 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.757260 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jp2w\" (UniqueName: \"kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w\") pod \"ovn-openstack-openstack-networker-ftv6g\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:36 crc kubenswrapper[4954]: I1206 09:27:36.902880 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:27:37 crc kubenswrapper[4954]: I1206 09:27:37.455229 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-ftv6g"] Dec 06 09:27:37 crc kubenswrapper[4954]: I1206 09:27:37.475835 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-ftv6g" event={"ID":"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77","Type":"ContainerStarted","Data":"cff1d2ec3979eb39ad7521acc5e1edc11af8a1eb4136f66dd859d966b272446a"} Dec 06 09:27:38 crc kubenswrapper[4954]: I1206 09:27:38.487784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-ftv6g" event={"ID":"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77","Type":"ContainerStarted","Data":"ab330fd35c319128d1121cc1a7cd7c334e2f3a0ae5a29b63234ab1ff6320b90f"} Dec 06 09:27:38 crc kubenswrapper[4954]: I1206 09:27:38.490822 4954 generic.go:334] "Generic (PLEG): container finished" podID="10123b35-1d1c-41fa-b752-896706a99792" containerID="80314ac65b13f4e236b99eb2a5f14a3203644454a444863de9554dff6bfd9825" exitCode=0 Dec 06 09:27:38 crc kubenswrapper[4954]: I1206 09:27:38.490884 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-skz67" event={"ID":"10123b35-1d1c-41fa-b752-896706a99792","Type":"ContainerDied","Data":"80314ac65b13f4e236b99eb2a5f14a3203644454a444863de9554dff6bfd9825"} Dec 06 09:27:38 crc kubenswrapper[4954]: I1206 09:27:38.512209 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-ftv6g" podStartSLOduration=2.115711148 podStartE2EDuration="2.512189478s" podCreationTimestamp="2025-12-06 09:27:36 +0000 UTC" firstStartedPulling="2025-12-06 09:27:37.45526652 +0000 UTC m=+9032.268625909" lastFinishedPulling="2025-12-06 09:27:37.85174484 +0000 UTC m=+9032.665104239" observedRunningTime="2025-12-06 09:27:38.50176667 +0000 UTC m=+9033.315126059" watchObservedRunningTime="2025-12-06 09:27:38.512189478 +0000 UTC m=+9033.325548867" Dec 06 09:27:39 crc kubenswrapper[4954]: I1206 09:27:39.949507 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.016488 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nthzm\" (UniqueName: \"kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm\") pod \"10123b35-1d1c-41fa-b752-896706a99792\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.016797 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key\") pod \"10123b35-1d1c-41fa-b752-896706a99792\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.016850 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory\") pod \"10123b35-1d1c-41fa-b752-896706a99792\" (UID: \"10123b35-1d1c-41fa-b752-896706a99792\") " Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.023455 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm" (OuterVolumeSpecName: "kube-api-access-nthzm") pod "10123b35-1d1c-41fa-b752-896706a99792" (UID: "10123b35-1d1c-41fa-b752-896706a99792"). InnerVolumeSpecName "kube-api-access-nthzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.045271 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory" (OuterVolumeSpecName: "inventory") pod "10123b35-1d1c-41fa-b752-896706a99792" (UID: "10123b35-1d1c-41fa-b752-896706a99792"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.046606 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10123b35-1d1c-41fa-b752-896706a99792" (UID: "10123b35-1d1c-41fa-b752-896706a99792"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.102174 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.102237 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.119647 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nthzm\" (UniqueName: \"kubernetes.io/projected/10123b35-1d1c-41fa-b752-896706a99792-kube-api-access-nthzm\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.119676 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.119699 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10123b35-1d1c-41fa-b752-896706a99792-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.524291 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-skz67" event={"ID":"10123b35-1d1c-41fa-b752-896706a99792","Type":"ContainerDied","Data":"495bba9d875b97bd51a6618f8a1f199a6b173e1423f639337d33741ca90f9b6c"} Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.524698 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495bba9d875b97bd51a6618f8a1f199a6b173e1423f639337d33741ca90f9b6c" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.524377 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-skz67" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.600891 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k6tcb"] Dec 06 09:27:40 crc kubenswrapper[4954]: E1206 09:27:40.601364 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10123b35-1d1c-41fa-b752-896706a99792" containerName="run-os-openstack-openstack-cell1" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.601383 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10123b35-1d1c-41fa-b752-896706a99792" containerName="run-os-openstack-openstack-cell1" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.601614 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="10123b35-1d1c-41fa-b752-896706a99792" containerName="run-os-openstack-openstack-cell1" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.602419 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.604609 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.605783 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.612459 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k6tcb"] Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.763740 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.763811 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.764031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dngn7\" (UniqueName: \"kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.865637 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.865750 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.865822 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dngn7\" (UniqueName: \"kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.871027 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc 
kubenswrapper[4954]: I1206 09:27:40.871399 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.882425 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngn7\" (UniqueName: \"kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7\") pod \"reboot-os-openstack-openstack-cell1-k6tcb\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:40 crc kubenswrapper[4954]: I1206 09:27:40.928978 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:41 crc kubenswrapper[4954]: I1206 09:27:41.495187 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k6tcb"] Dec 06 09:27:41 crc kubenswrapper[4954]: I1206 09:27:41.538591 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" event={"ID":"4b2b5006-9910-4d4f-86f1-99ccb1c6f942","Type":"ContainerStarted","Data":"974f11997bfd6923220271dd18b3005b2cb962318d065c5af96dd5471932848b"} Dec 06 09:27:43 crc kubenswrapper[4954]: I1206 09:27:43.557337 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" event={"ID":"4b2b5006-9910-4d4f-86f1-99ccb1c6f942","Type":"ContainerStarted","Data":"fbff1de66ca016ef514ec6a945a34e8595bb2bdf49ac287e5d56297f040d0e0f"} Dec 06 09:27:43 crc kubenswrapper[4954]: I1206 09:27:43.592000 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" podStartSLOduration=2.643026542 podStartE2EDuration="3.591972081s" podCreationTimestamp="2025-12-06 09:27:40 +0000 UTC" firstStartedPulling="2025-12-06 09:27:41.492046908 +0000 UTC m=+9036.305406297" lastFinishedPulling="2025-12-06 09:27:42.440992447 +0000 UTC m=+9037.254351836" observedRunningTime="2025-12-06 09:27:43.575746219 +0000 UTC m=+9038.389105618" watchObservedRunningTime="2025-12-06 09:27:43.591972081 +0000 UTC m=+9038.405331490" Dec 06 09:27:56 crc kubenswrapper[4954]: I1206 09:27:56.751293 4954 generic.go:334] "Generic (PLEG): container finished" podID="4b2b5006-9910-4d4f-86f1-99ccb1c6f942" containerID="fbff1de66ca016ef514ec6a945a34e8595bb2bdf49ac287e5d56297f040d0e0f" exitCode=0 Dec 06 09:27:56 crc kubenswrapper[4954]: I1206 09:27:56.751478 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" event={"ID":"4b2b5006-9910-4d4f-86f1-99ccb1c6f942","Type":"ContainerDied","Data":"fbff1de66ca016ef514ec6a945a34e8595bb2bdf49ac287e5d56297f040d0e0f"} Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.201717 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.353722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key\") pod \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.354280 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory\") pod \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.354326 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dngn7\" (UniqueName: \"kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7\") pod \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\" (UID: \"4b2b5006-9910-4d4f-86f1-99ccb1c6f942\") " Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.359481 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7" (OuterVolumeSpecName: "kube-api-access-dngn7") pod "4b2b5006-9910-4d4f-86f1-99ccb1c6f942" (UID: "4b2b5006-9910-4d4f-86f1-99ccb1c6f942"). InnerVolumeSpecName "kube-api-access-dngn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.384234 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b2b5006-9910-4d4f-86f1-99ccb1c6f942" (UID: "4b2b5006-9910-4d4f-86f1-99ccb1c6f942"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.394001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory" (OuterVolumeSpecName: "inventory") pod "4b2b5006-9910-4d4f-86f1-99ccb1c6f942" (UID: "4b2b5006-9910-4d4f-86f1-99ccb1c6f942"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.456767 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.456802 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.456813 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dngn7\" (UniqueName: \"kubernetes.io/projected/4b2b5006-9910-4d4f-86f1-99ccb1c6f942-kube-api-access-dngn7\") on node \"crc\" DevicePath \"\"" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.770995 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" event={"ID":"4b2b5006-9910-4d4f-86f1-99ccb1c6f942","Type":"ContainerDied","Data":"974f11997bfd6923220271dd18b3005b2cb962318d065c5af96dd5471932848b"} Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.771032 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974f11997bfd6923220271dd18b3005b2cb962318d065c5af96dd5471932848b" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.771063 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k6tcb" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.885616 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5hwtq"] Dec 06 09:27:58 crc kubenswrapper[4954]: E1206 09:27:58.886215 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b5006-9910-4d4f-86f1-99ccb1c6f942" containerName="reboot-os-openstack-openstack-cell1" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.886243 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b5006-9910-4d4f-86f1-99ccb1c6f942" containerName="reboot-os-openstack-openstack-cell1" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.886551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2b5006-9910-4d4f-86f1-99ccb1c6f942" containerName="reboot-os-openstack-openstack-cell1" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.887618 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.891670 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.892034 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.892168 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.897650 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.897821 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.898061 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.902485 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5hwtq"] Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971184 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971232 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971295 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971325 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971350 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971389 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971450 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971483 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971502 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwr2z\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971551 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " 
pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971625 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971652 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:58 crc kubenswrapper[4954]: I1206 09:27:58.971677 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.074869 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075052 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075123 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075160 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwr2z\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075203 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: 
\"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075320 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075351 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075388 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075457 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075534 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075596 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: 
\"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075740 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.075780 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.081326 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.083038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.084277 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.085173 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.085341 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.085602 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.087325 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.096894 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.096970 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.098309 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.098491 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.098705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.098898 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.101986 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.102364 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwr2z\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z\") pod \"install-certs-openstack-openstack-cell1-5hwtq\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.218114 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:27:59 crc kubenswrapper[4954]: I1206 09:27:59.824374 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5hwtq"] Dec 06 09:27:59 crc kubenswrapper[4954]: W1206 09:27:59.973715 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod772337e6_04b4_40a1_9cc9_2a482d88b845.slice/crio-1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128 WatchSource:0}: Error finding container 1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128: Status 404 returned error can't find the container with id 1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128 Dec 06 09:28:00 crc kubenswrapper[4954]: I1206 09:28:00.791452 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" event={"ID":"772337e6-04b4-40a1-9cc9-2a482d88b845","Type":"ContainerStarted","Data":"b05354548d2aa3cbc769a649f9c60c90f19b82878b625ac6654f076f32745d79"} Dec 06 09:28:00 crc kubenswrapper[4954]: I1206 09:28:00.791817 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" event={"ID":"772337e6-04b4-40a1-9cc9-2a482d88b845","Type":"ContainerStarted","Data":"1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128"} Dec 06 09:28:01 crc kubenswrapper[4954]: I1206 09:28:01.828202 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" podStartSLOduration=3.423987676 podStartE2EDuration="3.828149011s" podCreationTimestamp="2025-12-06 09:27:58 +0000 UTC" firstStartedPulling="2025-12-06 09:27:59.975978294 +0000 UTC m=+9054.789337683" lastFinishedPulling="2025-12-06 09:28:00.380139629 +0000 UTC m=+9055.193499018" observedRunningTime="2025-12-06 09:28:01.825025208 +0000 UTC m=+9056.638384597" watchObservedRunningTime="2025-12-06 09:28:01.828149011 +0000 UTC m=+9056.641508400" Dec 06 09:28:10 crc kubenswrapper[4954]: I1206 09:28:10.100861 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:28:10 crc kubenswrapper[4954]: I1206 09:28:10.101377 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:28:17 crc kubenswrapper[4954]: I1206 09:28:17.771796 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="ef4d1830-bfa7-4aba-8718-a7e540e52222" containerName="galera" probeResult="failure" output="command timed out" Dec 06 09:28:17 crc kubenswrapper[4954]: I1206 09:28:17.771844 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="ef4d1830-bfa7-4aba-8718-a7e540e52222" containerName="galera" probeResult="failure" output="command timed out" Dec 06 09:28:40 crc kubenswrapper[4954]: I1206 09:28:40.101263 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:28:40 crc kubenswrapper[4954]: I1206 09:28:40.101959 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:28:40 crc kubenswrapper[4954]: I1206 09:28:40.102013 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:28:40 crc kubenswrapper[4954]: I1206 09:28:40.102933 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:28:40 crc kubenswrapper[4954]: I1206 09:28:40.102994 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" gracePeriod=600 Dec 06 09:28:42 crc kubenswrapper[4954]: E1206 09:28:42.064274 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:28:42 crc kubenswrapper[4954]: I1206 09:28:42.162948 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" exitCode=0 Dec 06 09:28:42 crc kubenswrapper[4954]: I1206 09:28:42.162997 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"} Dec 06 09:28:42 crc 
kubenswrapper[4954]: I1206 09:28:42.163056 4954 scope.go:117] "RemoveContainer" containerID="aecf62838f817cb682fa72b028865467dea354261d177fde14cf882c68fe87c3" Dec 06 09:28:42 crc kubenswrapper[4954]: I1206 09:28:42.163859 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:28:42 crc kubenswrapper[4954]: E1206 09:28:42.164139 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:28:44 crc kubenswrapper[4954]: I1206 09:28:44.202824 4954 generic.go:334] "Generic (PLEG): container finished" podID="772337e6-04b4-40a1-9cc9-2a482d88b845" containerID="b05354548d2aa3cbc769a649f9c60c90f19b82878b625ac6654f076f32745d79" exitCode=0 Dec 06 09:28:44 crc kubenswrapper[4954]: I1206 09:28:44.203097 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" event={"ID":"772337e6-04b4-40a1-9cc9-2a482d88b845","Type":"ContainerDied","Data":"b05354548d2aa3cbc769a649f9c60c90f19b82878b625ac6654f076f32745d79"} Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.630217 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736047 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736106 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwr2z\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736129 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736169 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736214 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 
06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736231 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736259 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736293 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736322 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736374 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736408 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736462 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736482 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736512 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.736537 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0\") pod \"772337e6-04b4-40a1-9cc9-2a482d88b845\" (UID: \"772337e6-04b4-40a1-9cc9-2a482d88b845\") " Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.742993 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.743118 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.743188 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.743577 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.744215 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.744295 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.749661 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.757942 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.758329 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z" (OuterVolumeSpecName: "kube-api-access-qwr2z") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "kube-api-access-qwr2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.758805 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.759769 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.759815 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.760525 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.776831 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory" (OuterVolumeSpecName: "inventory") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.784423 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "772337e6-04b4-40a1-9cc9-2a482d88b845" (UID: "772337e6-04b4-40a1-9cc9-2a482d88b845"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.838698 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.838947 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwr2z\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-kube-api-access-qwr2z\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839007 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839066 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839195 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839262 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839318 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839372 4954 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839430 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839483 4954 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839533 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839609 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839665 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839718 4954 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772337e6-04b4-40a1-9cc9-2a482d88b845-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:45 crc kubenswrapper[4954]: I1206 09:28:45.839771 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/772337e6-04b4-40a1-9cc9-2a482d88b845-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.221704 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" event={"ID":"772337e6-04b4-40a1-9cc9-2a482d88b845","Type":"ContainerDied","Data":"1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128"} Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.222019 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1297274652b958e54fed8de91c7b75a466f8c6f16f3265df34713b1c095e4128" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.221774 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5hwtq" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.447374 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g2lrh"] Dec 06 09:28:46 crc kubenswrapper[4954]: E1206 09:28:46.448113 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772337e6-04b4-40a1-9cc9-2a482d88b845" containerName="install-certs-openstack-openstack-cell1" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.448198 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="772337e6-04b4-40a1-9cc9-2a482d88b845" containerName="install-certs-openstack-openstack-cell1" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.448465 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="772337e6-04b4-40a1-9cc9-2a482d88b845" containerName="install-certs-openstack-openstack-cell1" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.449477 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.451808 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.452420 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.464047 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g2lrh"] Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.557164 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.557259 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.557285 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.557331 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsv2\" (UniqueName: \"kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.557680 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.658983 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.659143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " 
pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.659197 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.659223 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.659242 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsv2\" (UniqueName: \"kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.660095 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.664119 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.664912 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.667548 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.680536 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsv2\" (UniqueName: \"kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2\") pod \"ovn-openstack-openstack-cell1-g2lrh\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:46 crc kubenswrapper[4954]: I1206 09:28:46.778588 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:28:47 crc kubenswrapper[4954]: I1206 09:28:47.352003 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g2lrh"] Dec 06 09:28:48 crc kubenswrapper[4954]: I1206 09:28:48.243515 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" event={"ID":"025be865-7bd0-4184-b542-76879dcb05c4","Type":"ContainerStarted","Data":"d9751a122e3f70c13b5569a860f52623f32b5e73ccaaadb69866b03cc097f51d"} Dec 06 09:28:48 crc kubenswrapper[4954]: I1206 09:28:48.243874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" event={"ID":"025be865-7bd0-4184-b542-76879dcb05c4","Type":"ContainerStarted","Data":"912f54b4001744e8332ada0f3f6c898d07244c5491b48395833c7df046d00049"} Dec 06 09:28:48 crc kubenswrapper[4954]: I1206 09:28:48.267986 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" podStartSLOduration=1.660991181 podStartE2EDuration="2.267963412s" podCreationTimestamp="2025-12-06 09:28:46 +0000 UTC" firstStartedPulling="2025-12-06 09:28:47.381922842 +0000 UTC m=+9102.195282231" lastFinishedPulling="2025-12-06 09:28:47.988895073 +0000 UTC m=+9102.802254462" observedRunningTime="2025-12-06 09:28:48.259923708 +0000 UTC m=+9103.073283107" watchObservedRunningTime="2025-12-06 09:28:48.267963412 +0000 UTC m=+9103.081322801" Dec 06 09:28:52 crc kubenswrapper[4954]: I1206 09:28:52.443749 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:28:52 crc kubenswrapper[4954]: E1206 09:28:52.444462 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:29:00 crc kubenswrapper[4954]: I1206 09:29:00.390750 4954 generic.go:334] "Generic (PLEG): container finished" podID="1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" containerID="ab330fd35c319128d1121cc1a7cd7c334e2f3a0ae5a29b63234ab1ff6320b90f" exitCode=0 Dec 06 09:29:00 crc kubenswrapper[4954]: I1206 09:29:00.390888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-ftv6g" event={"ID":"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77","Type":"ContainerDied","Data":"ab330fd35c319128d1121cc1a7cd7c334e2f3a0ae5a29b63234ab1ff6320b90f"} Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.341666 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.357848 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle\") pod \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.357890 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0\") pod \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.357950 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jp2w\" (UniqueName: \"kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w\") pod \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.358050 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key\") pod \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.358127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory\") pod \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\" (UID: \"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77\") " Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.413290 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w" (OuterVolumeSpecName: "kube-api-access-4jp2w") pod "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" (UID: "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77"). InnerVolumeSpecName "kube-api-access-4jp2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.414748 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" (UID: "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.419846 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" (UID: "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.424437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory" (OuterVolumeSpecName: "inventory") pod "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" (UID: "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.434541 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-ftv6g" event={"ID":"1668e041-54f7-4e5c-8b9e-0c5bb97cdf77","Type":"ContainerDied","Data":"cff1d2ec3979eb39ad7521acc5e1edc11af8a1eb4136f66dd859d966b272446a"} Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.434615 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff1d2ec3979eb39ad7521acc5e1edc11af8a1eb4136f66dd859d966b272446a" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.434684 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-ftv6g" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.447909 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" (UID: "1668e041-54f7-4e5c-8b9e-0c5bb97cdf77"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.459740 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.459774 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.459784 4954 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.459792 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jp2w\" (UniqueName: \"kubernetes.io/projected/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-kube-api-access-4jp2w\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.459800 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1668e041-54f7-4e5c-8b9e-0c5bb97cdf77-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.526742 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hrbwm"] Dec 06 09:29:02 crc kubenswrapper[4954]: E1206 09:29:02.527206 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" containerName="ovn-openstack-openstack-networker" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.527250 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" containerName="ovn-openstack-openstack-networker" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.527483 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1668e041-54f7-4e5c-8b9e-0c5bb97cdf77" containerName="ovn-openstack-openstack-networker" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.528542 4954 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.532236 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.536206 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.547318 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hrbwm"] Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562101 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562163 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562218 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562399 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562472 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.562514 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc 
kubenswrapper[4954]: I1206 09:29:02.663491 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.663551 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.663646 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.663664 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.663693 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.663768 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.667185 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.667245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: 
\"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.677121 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.677344 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.677778 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.690624 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5\") pod \"neutron-metadata-openstack-openstack-networker-hrbwm\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:02 crc kubenswrapper[4954]: I1206 09:29:02.853496 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:29:03 crc kubenswrapper[4954]: I1206 09:29:03.418918 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hrbwm"] Dec 06 09:29:03 crc kubenswrapper[4954]: I1206 09:29:03.458260 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" event={"ID":"50fb29f7-2be5-45d6-b204-692aec122a55","Type":"ContainerStarted","Data":"aaf3d855bb2a74d843266e10262e5cb7b03725fa36b9caf6a7d89b0fa46e07ab"} Dec 06 09:29:05 crc kubenswrapper[4954]: I1206 09:29:05.469740 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" event={"ID":"50fb29f7-2be5-45d6-b204-692aec122a55","Type":"ContainerStarted","Data":"36dd196835abfdbfc93b251a23a696a5b14c8e96f67449be323fde940e802fbb"} Dec 06 09:29:06 crc kubenswrapper[4954]: I1206 09:29:06.445285 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:29:06 crc kubenswrapper[4954]: E1206 09:29:06.445865 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.162127 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" podStartSLOduration=5.762310698 podStartE2EDuration="6.162104077s" podCreationTimestamp="2025-12-06 09:29:02 +0000 UTC" firstStartedPulling="2025-12-06 09:29:03.427246328 +0000 UTC m=+9118.240605707" lastFinishedPulling="2025-12-06 09:29:03.827039687 +0000 UTC m=+9118.640399086" observedRunningTime="2025-12-06 09:29:05.500181592 +0000 UTC m=+9120.313540991" watchObservedRunningTime="2025-12-06 09:29:08.162104077 +0000 UTC m=+9122.975463456" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.165874 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.168926 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.181746 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.221488 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.222394 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.222550 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jj4g\" (UniqueName: \"kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.324619 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.324723 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jj4g\" (UniqueName: \"kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.324875 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.325311 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.325370 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.353894 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5jj4g\" (UniqueName: \"kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g\") pod \"redhat-marketplace-64xkw\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:08 crc kubenswrapper[4954]: I1206 09:29:08.499477 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:09 crc kubenswrapper[4954]: I1206 09:29:09.074304 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:09 crc kubenswrapper[4954]: W1206 09:29:09.077940 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c61bf7_ce4f_4b3f_8656_a8654febd209.slice/crio-7d75f51d4944a355c843511eaa9f3053af1e22b9e0f69f33e65d0b54dfc5c307 WatchSource:0}: Error finding container 7d75f51d4944a355c843511eaa9f3053af1e22b9e0f69f33e65d0b54dfc5c307: Status 404 returned error can't find the container with id 7d75f51d4944a355c843511eaa9f3053af1e22b9e0f69f33e65d0b54dfc5c307 Dec 06 09:29:09 crc kubenswrapper[4954]: I1206 09:29:09.511990 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerID="3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463" exitCode=0 Dec 06 09:29:09 crc kubenswrapper[4954]: I1206 09:29:09.512400 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerDied","Data":"3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463"} Dec 06 09:29:09 crc kubenswrapper[4954]: I1206 09:29:09.513740 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerStarted","Data":"7d75f51d4944a355c843511eaa9f3053af1e22b9e0f69f33e65d0b54dfc5c307"} Dec 06 09:29:11 crc kubenswrapper[4954]: I1206 09:29:11.546078 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerID="30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133" exitCode=0 Dec 06 09:29:11 crc kubenswrapper[4954]: I1206 09:29:11.546638 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerDied","Data":"30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133"} Dec 06 09:29:13 crc kubenswrapper[4954]: I1206 09:29:13.569069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerStarted","Data":"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195"} Dec 06 09:29:13 crc kubenswrapper[4954]: I1206 09:29:13.604716 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64xkw" podStartSLOduration=3.150008462 podStartE2EDuration="5.604695681s" podCreationTimestamp="2025-12-06 09:29:08 +0000 UTC" firstStartedPulling="2025-12-06 09:29:09.514310266 +0000 UTC m=+9124.327669675" lastFinishedPulling="2025-12-06 09:29:11.968997505 +0000 UTC m=+9126.782356894" observedRunningTime="2025-12-06 09:29:13.595824105 +0000 UTC m=+9128.409183514" 
watchObservedRunningTime="2025-12-06 09:29:13.604695681 +0000 UTC m=+9128.418055060" Dec 06 09:29:18 crc kubenswrapper[4954]: I1206 09:29:18.500595 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:18 crc kubenswrapper[4954]: I1206 09:29:18.501127 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:18 crc kubenswrapper[4954]: I1206 09:29:18.544362 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:18 crc kubenswrapper[4954]: I1206 09:29:18.664404 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:18 crc kubenswrapper[4954]: I1206 09:29:18.782084 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:19 crc kubenswrapper[4954]: I1206 09:29:19.443715 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:29:19 crc kubenswrapper[4954]: E1206 09:29:19.444184 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:29:20 crc kubenswrapper[4954]: I1206 09:29:20.643530 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64xkw" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="registry-server" containerID="cri-o://466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195" gracePeriod=2 Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.161400 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.200638 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities\") pod \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.200775 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content\") pod \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.200847 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jj4g\" (UniqueName: \"kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g\") pod \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\" (UID: \"f6c61bf7-ce4f-4b3f-8656-a8654febd209\") " Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.202806 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities" (OuterVolumeSpecName: "utilities") pod "f6c61bf7-ce4f-4b3f-8656-a8654febd209" (UID: "f6c61bf7-ce4f-4b3f-8656-a8654febd209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.216026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g" (OuterVolumeSpecName: "kube-api-access-5jj4g") pod "f6c61bf7-ce4f-4b3f-8656-a8654febd209" (UID: "f6c61bf7-ce4f-4b3f-8656-a8654febd209"). InnerVolumeSpecName "kube-api-access-5jj4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.224156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6c61bf7-ce4f-4b3f-8656-a8654febd209" (UID: "f6c61bf7-ce4f-4b3f-8656-a8654febd209"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.304525 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.304602 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c61bf7-ce4f-4b3f-8656-a8654febd209-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.304620 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jj4g\" (UniqueName: \"kubernetes.io/projected/f6c61bf7-ce4f-4b3f-8656-a8654febd209-kube-api-access-5jj4g\") on node \"crc\" DevicePath \"\"" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.660197 4954 generic.go:334] "Generic (PLEG): container finished" podID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerID="466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195" exitCode=0 Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.660237 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerDied","Data":"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195"} Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.660274 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64xkw" event={"ID":"f6c61bf7-ce4f-4b3f-8656-a8654febd209","Type":"ContainerDied","Data":"7d75f51d4944a355c843511eaa9f3053af1e22b9e0f69f33e65d0b54dfc5c307"} Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.660285 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64xkw" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.660296 4954 scope.go:117] "RemoveContainer" containerID="466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.686976 4954 scope.go:117] "RemoveContainer" containerID="30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.710608 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.720194 4954 scope.go:117] "RemoveContainer" containerID="3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.728137 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64xkw"] Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.789252 4954 scope.go:117] "RemoveContainer" containerID="466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195" Dec 06 09:29:21 crc kubenswrapper[4954]: E1206 09:29:21.792957 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195\": container with ID starting with 466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195 not found: ID does not exist" containerID="466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.793019 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195"} err="failed to get container status \"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195\": rpc error: code = NotFound desc = could not find container \"466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195\": container with ID starting with 466c56536002e0e67006fc8be8f38f4645deeb999e22069a4c2e56a1fdf3e195 not found: ID does not exist" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.793045 4954 scope.go:117] "RemoveContainer" containerID="30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133" Dec 06 09:29:21 crc kubenswrapper[4954]: E1206 09:29:21.793769 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133\": container with ID starting with 30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133 not found: ID does not exist" containerID="30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.793853 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133"} err="failed to get container status \"30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133\": rpc error: code = NotFound desc = could not find container \"30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133\": container with ID starting with 30f34d3216028a466663128ae5d4407de6f937f118790cffd5a63d0a1a7d1133 not found: ID does not exist" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.793899 4954 scope.go:117] "RemoveContainer" 
containerID="3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463" Dec 06 09:29:21 crc kubenswrapper[4954]: E1206 09:29:21.794365 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463\": container with ID starting with 3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463 not found: ID does not exist" containerID="3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463" Dec 06 09:29:21 crc kubenswrapper[4954]: I1206 09:29:21.794447 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463"} err="failed to get container status \"3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463\": rpc error: code = NotFound desc = could not find container \"3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463\": container with ID starting with 3b006e97592906f3ca8ea6a438d8790acd13c1825f9a89049b28ed1384c59463 not found: ID does not exist" Dec 06 09:29:23 crc kubenswrapper[4954]: I1206 09:29:23.459112 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" path="/var/lib/kubelet/pods/f6c61bf7-ce4f-4b3f-8656-a8654febd209/volumes" Dec 06 09:29:33 crc kubenswrapper[4954]: I1206 09:29:33.443002 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:29:33 crc kubenswrapper[4954]: E1206 09:29:33.443835 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:29:46 crc kubenswrapper[4954]: I1206 09:29:46.443797 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:29:46 crc kubenswrapper[4954]: E1206 09:29:46.444831 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.149140 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb"] Dec 06 09:30:00 crc kubenswrapper[4954]: E1206 09:30:00.180191 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="extract-content" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.180615 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="extract-content" Dec 06 09:30:00 crc kubenswrapper[4954]: E1206 09:30:00.180733 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="registry-server" Dec 06 09:30:00 crc 
kubenswrapper[4954]: I1206 09:30:00.180801 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="registry-server" Dec 06 09:30:00 crc kubenswrapper[4954]: E1206 09:30:00.180863 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="extract-utilities" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.180912 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="extract-utilities" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.181548 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c61bf7-ce4f-4b3f-8656-a8654febd209" containerName="registry-server" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.182662 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.185763 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.186892 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.191829 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb"] Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.374697 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.374980 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.375027 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7829\" (UniqueName: \"kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.443845 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:30:00 crc kubenswrapper[4954]: E1206 09:30:00.444121 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.477382 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.477648 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.477690 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7829\" (UniqueName: \"kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.478435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.484170 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.494664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7829\" (UniqueName: \"kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829\") pod \"collect-profiles-29416890-5w8nb\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.505709 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:00 crc kubenswrapper[4954]: I1206 09:30:00.990259 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb"] Dec 06 09:30:00 crc kubenswrapper[4954]: W1206 09:30:00.994250 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb1fbbb_35ab_4c0b_ba48_b0b2b9a6d54e.slice/crio-a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8 WatchSource:0}: Error finding container a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8: Status 404 returned error can't find the container with id a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8 Dec 06 09:30:01 crc kubenswrapper[4954]: I1206 09:30:01.033147 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" event={"ID":"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e","Type":"ContainerStarted","Data":"a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8"} Dec 06 09:30:02 crc kubenswrapper[4954]: I1206 09:30:02.043107 4954 generic.go:334] "Generic (PLEG): container finished" podID="1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" containerID="8ba1edb0760f827f659527418cacefb800eeff40f5dbb8ed52da1de212a97b0d" exitCode=0 Dec 06 09:30:02 crc kubenswrapper[4954]: I1206 09:30:02.043388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" event={"ID":"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e","Type":"ContainerDied","Data":"8ba1edb0760f827f659527418cacefb800eeff40f5dbb8ed52da1de212a97b0d"} Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.486478 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.639081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume\") pod \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.639779 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" (UID: "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.639865 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume\") pod \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.640177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7829\" (UniqueName: \"kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829\") pod \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\" (UID: \"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e\") " Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.642374 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.647127 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829" (OuterVolumeSpecName: "kube-api-access-r7829") pod "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" (UID: "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e"). InnerVolumeSpecName "kube-api-access-r7829". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.649931 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" (UID: "1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.744837 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7829\" (UniqueName: \"kubernetes.io/projected/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-kube-api-access-r7829\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:03 crc kubenswrapper[4954]: I1206 09:30:03.744897 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.069600 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.069626 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb" event={"ID":"1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e","Type":"ContainerDied","Data":"a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8"} Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.069665 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6f9b22e0e01c60c614b39f86435a8128a77a27d291a675602cc74846cee63f8" Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.072237 4954 generic.go:334] "Generic (PLEG): container finished" podID="025be865-7bd0-4184-b542-76879dcb05c4" containerID="d9751a122e3f70c13b5569a860f52623f32b5e73ccaaadb69866b03cc097f51d" exitCode=0 Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.072294 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" event={"ID":"025be865-7bd0-4184-b542-76879dcb05c4","Type":"ContainerDied","Data":"d9751a122e3f70c13b5569a860f52623f32b5e73ccaaadb69866b03cc097f51d"} Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.590317 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56"] Dec 06 09:30:04 crc kubenswrapper[4954]: I1206 09:30:04.603883 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416845-w6s56"] Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.456931 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6704a9-ac73-4944-b286-047db35d742f" path="/var/lib/kubelet/pods/ff6704a9-ac73-4944-b286-047db35d742f/volumes" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.776932 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.901993 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key\") pod \"025be865-7bd0-4184-b542-76879dcb05c4\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.902068 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjsv2\" (UniqueName: \"kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2\") pod \"025be865-7bd0-4184-b542-76879dcb05c4\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.902211 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0\") pod \"025be865-7bd0-4184-b542-76879dcb05c4\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.902341 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory\") pod \"025be865-7bd0-4184-b542-76879dcb05c4\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.902388 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle\") pod \"025be865-7bd0-4184-b542-76879dcb05c4\" (UID: \"025be865-7bd0-4184-b542-76879dcb05c4\") " Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.907722 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "025be865-7bd0-4184-b542-76879dcb05c4" (UID: "025be865-7bd0-4184-b542-76879dcb05c4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.907827 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2" (OuterVolumeSpecName: "kube-api-access-tjsv2") pod "025be865-7bd0-4184-b542-76879dcb05c4" (UID: "025be865-7bd0-4184-b542-76879dcb05c4"). InnerVolumeSpecName "kube-api-access-tjsv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.934126 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "025be865-7bd0-4184-b542-76879dcb05c4" (UID: "025be865-7bd0-4184-b542-76879dcb05c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.933774 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "025be865-7bd0-4184-b542-76879dcb05c4" (UID: "025be865-7bd0-4184-b542-76879dcb05c4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:30:05 crc kubenswrapper[4954]: I1206 09:30:05.936743 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory" (OuterVolumeSpecName: "inventory") pod "025be865-7bd0-4184-b542-76879dcb05c4" (UID: "025be865-7bd0-4184-b542-76879dcb05c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.004719 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjsv2\" (UniqueName: \"kubernetes.io/projected/025be865-7bd0-4184-b542-76879dcb05c4-kube-api-access-tjsv2\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.004757 4954 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/025be865-7bd0-4184-b542-76879dcb05c4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.004766 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.004776 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.004784 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025be865-7bd0-4184-b542-76879dcb05c4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.092999 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" event={"ID":"025be865-7bd0-4184-b542-76879dcb05c4","Type":"ContainerDied","Data":"912f54b4001744e8332ada0f3f6c898d07244c5491b48395833c7df046d00049"} Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.093058 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912f54b4001744e8332ada0f3f6c898d07244c5491b48395833c7df046d00049" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.093064 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g2lrh" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.277023 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5mxhl"] Dec 06 09:30:06 crc kubenswrapper[4954]: E1206 09:30:06.277465 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025be865-7bd0-4184-b542-76879dcb05c4" containerName="ovn-openstack-openstack-cell1" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.277480 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="025be865-7bd0-4184-b542-76879dcb05c4" containerName="ovn-openstack-openstack-cell1" Dec 06 09:30:06 crc kubenswrapper[4954]: E1206 09:30:06.277500 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" containerName="collect-profiles" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.277506 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" containerName="collect-profiles" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.277712 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" containerName="collect-profiles" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.277750 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="025be865-7bd0-4184-b542-76879dcb05c4" containerName="ovn-openstack-openstack-cell1" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.278441 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.282225 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.282777 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.300765 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5mxhl"] Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.415191 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.415755 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.415863 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.415979 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.416064 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9hb\" (UniqueName: \"kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.416177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.517845 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.517955 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.517978 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.518008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9hb\" (UniqueName: \"kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.518030 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.518072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.522090 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.522273 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.522373 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.522419 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.522863 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.536195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9hb\" (UniqueName: \"kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb\") pod \"neutron-metadata-openstack-openstack-cell1-5mxhl\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:06 crc kubenswrapper[4954]: I1206 09:30:06.595197 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:30:07 crc kubenswrapper[4954]: I1206 09:30:07.126334 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5mxhl"] Dec 06 09:30:08 crc kubenswrapper[4954]: I1206 09:30:08.130237 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" event={"ID":"9135ca19-eb24-4e05-a3b7-a8511ef9368e","Type":"ContainerStarted","Data":"018aac7a0d64b7798e6481e106cabf830120656a7e94b212a8e8210577791dbb"} Dec 06 09:30:09 crc kubenswrapper[4954]: I1206 09:30:09.141316 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" event={"ID":"9135ca19-eb24-4e05-a3b7-a8511ef9368e","Type":"ContainerStarted","Data":"6b3ef503c2f420b195b440bd528878d0a5a95c612c0154bbb80d153eff20adc0"} Dec 06 09:30:09 crc kubenswrapper[4954]: I1206 09:30:09.176197 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" podStartSLOduration=2.710005935 podStartE2EDuration="3.176177653s" podCreationTimestamp="2025-12-06 09:30:06 +0000 UTC" firstStartedPulling="2025-12-06 09:30:07.680511679 +0000 UTC m=+9182.493871058" lastFinishedPulling="2025-12-06 09:30:08.146683387 +0000 UTC m=+9182.960042776" observedRunningTime="2025-12-06 09:30:09.16668974 +0000 UTC m=+9183.980049129" watchObservedRunningTime="2025-12-06 09:30:09.176177653 +0000 UTC m=+9183.989537042" Dec 06 09:30:10 crc kubenswrapper[4954]: I1206 09:30:10.152309 4954 generic.go:334] "Generic (PLEG): container finished" podID="50fb29f7-2be5-45d6-b204-692aec122a55" containerID="36dd196835abfdbfc93b251a23a696a5b14c8e96f67449be323fde940e802fbb" exitCode=0 Dec 06 09:30:10 crc kubenswrapper[4954]: I1206 09:30:10.152390 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" event={"ID":"50fb29f7-2be5-45d6-b204-692aec122a55","Type":"ContainerDied","Data":"36dd196835abfdbfc93b251a23a696a5b14c8e96f67449be323fde940e802fbb"} Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.639459 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750232 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750302 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750332 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750396 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750445 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.750489 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle\") pod \"50fb29f7-2be5-45d6-b204-692aec122a55\" (UID: \"50fb29f7-2be5-45d6-b204-692aec122a55\") " Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.796191 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5" (OuterVolumeSpecName: "kube-api-access-rktt5") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "kube-api-access-rktt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.796726 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.852897 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/50fb29f7-2be5-45d6-b204-692aec122a55-kube-api-access-rktt5\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.852920 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.859825 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory" (OuterVolumeSpecName: "inventory") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.864740 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.871960 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.884482 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "50fb29f7-2be5-45d6-b204-692aec122a55" (UID: "50fb29f7-2be5-45d6-b204-692aec122a55"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.954399 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.954440 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.954456 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:11 crc kubenswrapper[4954]: I1206 09:30:11.954470 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50fb29f7-2be5-45d6-b204-692aec122a55-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:30:12 crc kubenswrapper[4954]: I1206 09:30:12.172959 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" event={"ID":"50fb29f7-2be5-45d6-b204-692aec122a55","Type":"ContainerDied","Data":"aaf3d855bb2a74d843266e10262e5cb7b03725fa36b9caf6a7d89b0fa46e07ab"} Dec 06 09:30:12 crc kubenswrapper[4954]: I1206 09:30:12.172998 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf3d855bb2a74d843266e10262e5cb7b03725fa36b9caf6a7d89b0fa46e07ab" Dec 06 09:30:12 crc kubenswrapper[4954]: I1206 09:30:12.173018 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hrbwm" Dec 06 09:30:14 crc kubenswrapper[4954]: I1206 09:30:14.444209 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:30:14 crc kubenswrapper[4954]: E1206 09:30:14.444962 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:30:26 crc kubenswrapper[4954]: I1206 09:30:26.443534 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:30:26 crc kubenswrapper[4954]: E1206 09:30:26.444355 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:30:29 crc kubenswrapper[4954]: I1206 09:30:29.472246 4954 scope.go:117] "RemoveContainer" containerID="072d6be198759a163bbf3f43e5b5bb10cc516b1aba69f4ef435e9d993b123004" Dec 06 09:30:37 crc kubenswrapper[4954]: I1206 09:30:37.444360 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:30:37 crc kubenswrapper[4954]: E1206 09:30:37.445438 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:30:48 crc kubenswrapper[4954]: I1206 09:30:48.444750 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:30:48 crc kubenswrapper[4954]: E1206 09:30:48.445854 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:03 crc kubenswrapper[4954]: I1206 09:31:03.443478 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:31:03 crc kubenswrapper[4954]: E1206 09:31:03.444228 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:06 crc kubenswrapper[4954]: I1206 09:31:06.738967 4954 generic.go:334] "Generic (PLEG): container finished" podID="9135ca19-eb24-4e05-a3b7-a8511ef9368e" containerID="6b3ef503c2f420b195b440bd528878d0a5a95c612c0154bbb80d153eff20adc0" exitCode=0 Dec 06 09:31:06 crc kubenswrapper[4954]: I1206 09:31:06.739528 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" event={"ID":"9135ca19-eb24-4e05-a3b7-a8511ef9368e","Type":"ContainerDied","Data":"6b3ef503c2f420b195b440bd528878d0a5a95c612c0154bbb80d153eff20adc0"} Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.192623 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367219 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367410 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367442 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367526 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367588 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9hb\" (UniqueName: \"kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.367708 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory\") pod \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\" (UID: \"9135ca19-eb24-4e05-a3b7-a8511ef9368e\") " Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.373916 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.373968 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb" (OuterVolumeSpecName: "kube-api-access-8r9hb") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). InnerVolumeSpecName "kube-api-access-8r9hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.400241 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.403843 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.404457 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.405208 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory" (OuterVolumeSpecName: "inventory") pod "9135ca19-eb24-4e05-a3b7-a8511ef9368e" (UID: "9135ca19-eb24-4e05-a3b7-a8511ef9368e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471082 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471120 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471131 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471142 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471157 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9135ca19-eb24-4e05-a3b7-a8511ef9368e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.471168 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r9hb\" (UniqueName: \"kubernetes.io/projected/9135ca19-eb24-4e05-a3b7-a8511ef9368e-kube-api-access-8r9hb\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.763255 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" event={"ID":"9135ca19-eb24-4e05-a3b7-a8511ef9368e","Type":"ContainerDied","Data":"018aac7a0d64b7798e6481e106cabf830120656a7e94b212a8e8210577791dbb"} Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.763303 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018aac7a0d64b7798e6481e106cabf830120656a7e94b212a8e8210577791dbb" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.763338 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5mxhl" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.886610 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rjsg8"] Dec 06 09:31:08 crc kubenswrapper[4954]: E1206 09:31:08.887177 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50fb29f7-2be5-45d6-b204-692aec122a55" containerName="neutron-metadata-openstack-openstack-networker" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.887199 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="50fb29f7-2be5-45d6-b204-692aec122a55" containerName="neutron-metadata-openstack-openstack-networker" Dec 06 09:31:08 crc kubenswrapper[4954]: E1206 09:31:08.887221 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9135ca19-eb24-4e05-a3b7-a8511ef9368e" containerName="neutron-metadata-openstack-openstack-cell1" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.887229 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9135ca19-eb24-4e05-a3b7-a8511ef9368e" containerName="neutron-metadata-openstack-openstack-cell1" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.887551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9135ca19-eb24-4e05-a3b7-a8511ef9368e" containerName="neutron-metadata-openstack-openstack-cell1" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.887592 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="50fb29f7-2be5-45d6-b204-692aec122a55" containerName="neutron-metadata-openstack-openstack-networker" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.888692 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.891837 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.892027 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.892345 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.892693 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.893054 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.906287 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rjsg8"] Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.986469 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.986604 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory\") 
pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.986692 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.986733 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwj4\" (UniqueName: \"kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:08 crc kubenswrapper[4954]: I1206 09:31:08.986830 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.088614 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.088682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwj4\" (UniqueName: \"kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.088706 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.088801 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.088855 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc 
kubenswrapper[4954]: I1206 09:31:09.093330 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.093698 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.100027 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.101175 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.111915 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwj4\" (UniqueName: \"kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4\") pod \"libvirt-openstack-openstack-cell1-rjsg8\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") " pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.227754 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" Dec 06 09:31:09 crc kubenswrapper[4954]: I1206 09:31:09.778244 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rjsg8"] Dec 06 09:31:10 crc kubenswrapper[4954]: I1206 09:31:10.783236 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" event={"ID":"5f829cef-d637-404f-b735-02768488c5f7","Type":"ContainerStarted","Data":"08684d0feb3652159eea23a7a56bcc261ab03efe30feaa572967ffe0382f69ee"} Dec 06 09:31:10 crc kubenswrapper[4954]: I1206 09:31:10.783602 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" event={"ID":"5f829cef-d637-404f-b735-02768488c5f7","Type":"ContainerStarted","Data":"3bebc40a58278a93af3dfd0b29108945d0e4e3f6bc11e2b940fc8b7495f4596b"} Dec 06 09:31:10 crc kubenswrapper[4954]: I1206 09:31:10.805979 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" podStartSLOduration=2.288332542 podStartE2EDuration="2.805960462s" podCreationTimestamp="2025-12-06 09:31:08 +0000 UTC" firstStartedPulling="2025-12-06 09:31:09.785652161 +0000 UTC m=+9244.599011550" lastFinishedPulling="2025-12-06 09:31:10.303280081 +0000 UTC m=+9245.116639470" observedRunningTime="2025-12-06 09:31:10.80065748 +0000 UTC m=+9245.614016889" watchObservedRunningTime="2025-12-06 09:31:10.805960462 +0000 UTC m=+9245.619319851" Dec 06 09:31:18 crc kubenswrapper[4954]: I1206 09:31:18.443600 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:31:18 crc kubenswrapper[4954]: E1206 09:31:18.444352 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.461850 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.468731 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.469853 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.623951 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.624351 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.624435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spk6\" (UniqueName: \"kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.726785 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.726859 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.726905 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spk6\" (UniqueName: \"kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.727299 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.727482 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.777384 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7spk6\" (UniqueName: \"kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6\") pod \"community-operators-n7zn6\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:29 crc kubenswrapper[4954]: I1206 09:31:29.802845 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:30 crc kubenswrapper[4954]: I1206 09:31:30.361877 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:31 crc kubenswrapper[4954]: I1206 09:31:31.012654 4954 generic.go:334] "Generic (PLEG): container finished" podID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerID="59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da" exitCode=0 Dec 06 09:31:31 crc kubenswrapper[4954]: I1206 09:31:31.013007 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerDied","Data":"59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da"} Dec 06 09:31:31 crc kubenswrapper[4954]: I1206 09:31:31.013043 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerStarted","Data":"a6cf6276ee3c531fcebfe840955c7263c4a1f74ce77da2f0b76dd1c6698c8844"} Dec 06 09:31:31 crc kubenswrapper[4954]: I1206 09:31:31.445773 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:31:31 crc kubenswrapper[4954]: E1206 09:31:31.446346 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:32 crc kubenswrapper[4954]: I1206 09:31:32.025437 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerStarted","Data":"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d"} Dec 06 09:31:33 crc kubenswrapper[4954]: I1206 09:31:33.036399 4954 generic.go:334] "Generic (PLEG): container finished" podID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerID="5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d" exitCode=0 Dec 06 09:31:33 crc kubenswrapper[4954]: I1206 09:31:33.036443 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerDied","Data":"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d"} Dec 06 09:31:34 crc kubenswrapper[4954]: I1206 09:31:34.048053 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerStarted","Data":"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2"} Dec 06 09:31:34 crc kubenswrapper[4954]: I1206 09:31:34.088786 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7zn6" podStartSLOduration=2.643787721 podStartE2EDuration="5.088757471s" podCreationTimestamp="2025-12-06 09:31:29 +0000 UTC" firstStartedPulling="2025-12-06 09:31:31.015029239 +0000 UTC m=+9265.828388628" lastFinishedPulling="2025-12-06 09:31:33.459998989 +0000 UTC m=+9268.273358378" observedRunningTime="2025-12-06 09:31:34.073923386 +0000 UTC m=+9268.887282795" watchObservedRunningTime="2025-12-06 09:31:34.088757471 +0000 UTC m=+9268.902116900" Dec 06 09:31:39 crc kubenswrapper[4954]: I1206 09:31:39.803193 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:39 crc kubenswrapper[4954]: I1206 09:31:39.803908 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:39 crc kubenswrapper[4954]: I1206 09:31:39.858318 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:40 crc kubenswrapper[4954]: I1206 09:31:40.149731 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:40 crc kubenswrapper[4954]: I1206 09:31:40.206702 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.126518 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n7zn6" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="registry-server" containerID="cri-o://755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2" gracePeriod=2 Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.621125 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.750797 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities\") pod \"0112a86b-91d1-4d68-b4b2-ed01777f1470\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.751168 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content\") pod \"0112a86b-91d1-4d68-b4b2-ed01777f1470\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.751202 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7spk6\" (UniqueName: \"kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6\") pod \"0112a86b-91d1-4d68-b4b2-ed01777f1470\" (UID: \"0112a86b-91d1-4d68-b4b2-ed01777f1470\") " Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.751820 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities" (OuterVolumeSpecName: "utilities") pod "0112a86b-91d1-4d68-b4b2-ed01777f1470" (UID: "0112a86b-91d1-4d68-b4b2-ed01777f1470"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.756769 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6" (OuterVolumeSpecName: "kube-api-access-7spk6") pod "0112a86b-91d1-4d68-b4b2-ed01777f1470" (UID: "0112a86b-91d1-4d68-b4b2-ed01777f1470"). InnerVolumeSpecName "kube-api-access-7spk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.795156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0112a86b-91d1-4d68-b4b2-ed01777f1470" (UID: "0112a86b-91d1-4d68-b4b2-ed01777f1470"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.854104 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.854134 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0112a86b-91d1-4d68-b4b2-ed01777f1470-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:42 crc kubenswrapper[4954]: I1206 09:31:42.854147 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7spk6\" (UniqueName: \"kubernetes.io/projected/0112a86b-91d1-4d68-b4b2-ed01777f1470-kube-api-access-7spk6\") on node \"crc\" DevicePath \"\"" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.140858 4954 generic.go:334] "Generic (PLEG): container finished" podID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerID="755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2" exitCode=0 Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.140908 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerDied","Data":"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2"} Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.140940 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7zn6" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.140955 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7zn6" event={"ID":"0112a86b-91d1-4d68-b4b2-ed01777f1470","Type":"ContainerDied","Data":"a6cf6276ee3c531fcebfe840955c7263c4a1f74ce77da2f0b76dd1c6698c8844"} Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.140980 4954 scope.go:117] "RemoveContainer" containerID="755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.181740 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.187013 4954 scope.go:117] "RemoveContainer" containerID="5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.200018 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7zn6"] Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.211667 4954 scope.go:117] "RemoveContainer" containerID="59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.258806 4954 scope.go:117] "RemoveContainer" containerID="755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2" Dec 06 09:31:43 crc kubenswrapper[4954]: E1206 09:31:43.259329 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2\": container with ID starting with 755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2 not found: ID does not exist" containerID="755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.259402 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2"} err="failed to get container status \"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2\": rpc error: code = NotFound desc = could not find container \"755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2\": container with ID starting with 755bdfd06194c0312a9310105396f280e88204174a0eec9b9fa7cc952490bbd2 not found: ID does not exist" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.259442 4954 scope.go:117] "RemoveContainer" containerID="5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d" Dec 06 09:31:43 crc kubenswrapper[4954]: E1206 09:31:43.259886 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d\": container with ID starting with 5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d not found: ID does not exist" containerID="5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.259931 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d"} err="failed to get container status \"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d\": rpc error: code = NotFound desc = could not find 
container \"5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d\": container with ID starting with 5f2d5b27c0707c745c0764ddfe7a207848fd972313a245d047bfab997d996f2d not found: ID does not exist" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.259964 4954 scope.go:117] "RemoveContainer" containerID="59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da" Dec 06 09:31:43 crc kubenswrapper[4954]: E1206 09:31:43.260241 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da\": container with ID starting with 59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da not found: ID does not exist" containerID="59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.260284 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da"} err="failed to get container status \"59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da\": rpc error: code = NotFound desc = could not find container \"59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da\": container with ID starting with 59ff94367250d114088abcacedab1bb3bd4f219a662f25c5505caa02348cf4da not found: ID does not exist" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.444136 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:31:43 crc kubenswrapper[4954]: E1206 09:31:43.444880 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:43 crc kubenswrapper[4954]: I1206 09:31:43.459351 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" path="/var/lib/kubelet/pods/0112a86b-91d1-4d68-b4b2-ed01777f1470/volumes" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.834108 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"] Dec 06 09:31:51 crc kubenswrapper[4954]: E1206 09:31:51.835388 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="extract-utilities" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.835412 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="extract-utilities" Dec 06 09:31:51 crc kubenswrapper[4954]: E1206 09:31:51.835437 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="extract-content" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.835449 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="extract-content" Dec 06 09:31:51 crc kubenswrapper[4954]: E1206 09:31:51.835474 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="registry-server" Dec 06 09:31:51 
crc kubenswrapper[4954]: I1206 09:31:51.835485 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="registry-server" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.835853 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0112a86b-91d1-4d68-b4b2-ed01777f1470" containerName="registry-server" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.839042 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.873677 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"] Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.920703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.921093 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:51 crc kubenswrapper[4954]: I1206 09:31:51.921125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdnw2\" (UniqueName: \"kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.022473 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.022512 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdnw2\" (UniqueName: \"kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.022646 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.023001 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " 
pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.023032 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.044714 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdnw2\" (UniqueName: \"kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2\") pod \"certified-operators-gdrq8\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") " pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.167160 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:31:52 crc kubenswrapper[4954]: I1206 09:31:52.801017 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"] Dec 06 09:31:53 crc kubenswrapper[4954]: I1206 09:31:53.320645 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerStarted","Data":"af9850a34366c178867e5773e5109de70323e1f66a8f43acad2aa08d37ab1859"} Dec 06 09:31:54 crc kubenswrapper[4954]: I1206 09:31:54.333369 4954 generic.go:334] "Generic (PLEG): container finished" podID="09b57622-447b-4285-8acc-c005b994cab0" containerID="70f80c44e2d8cad5e49f17961f3bff7541629aed8f858880daa61545a57bc4be" exitCode=0 Dec 06 09:31:54 crc kubenswrapper[4954]: I1206 09:31:54.333499 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerDied","Data":"70f80c44e2d8cad5e49f17961f3bff7541629aed8f858880daa61545a57bc4be"} Dec 06 09:31:56 crc kubenswrapper[4954]: I1206 09:31:56.353187 4954 generic.go:334] "Generic (PLEG): container finished" podID="09b57622-447b-4285-8acc-c005b994cab0" containerID="1f808ead3579682297a3ad29695ec3054726390f49791829a83a32de6f5bb427" exitCode=0 Dec 06 09:31:56 crc kubenswrapper[4954]: I1206 09:31:56.353429 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerDied","Data":"1f808ead3579682297a3ad29695ec3054726390f49791829a83a32de6f5bb427"} Dec 06 09:31:56 crc kubenswrapper[4954]: I1206 09:31:56.443946 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e" Dec 06 09:31:56 crc kubenswrapper[4954]: E1206 09:31:56.444206 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:31:57 crc kubenswrapper[4954]: I1206 09:31:57.367232 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" 
event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerStarted","Data":"c4417bd3d9d8c1a91b7b1e10266bb305cdd80e95c983db19419688ab076d99b4"} Dec 06 09:31:57 crc kubenswrapper[4954]: I1206 09:31:57.400927 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdrq8" podStartSLOduration=3.964313466 podStartE2EDuration="6.400901385s" podCreationTimestamp="2025-12-06 09:31:51 +0000 UTC" firstStartedPulling="2025-12-06 09:31:54.335125863 +0000 UTC m=+9289.148485252" lastFinishedPulling="2025-12-06 09:31:56.771713782 +0000 UTC m=+9291.585073171" observedRunningTime="2025-12-06 09:31:57.387058636 +0000 UTC m=+9292.200418025" watchObservedRunningTime="2025-12-06 09:31:57.400901385 +0000 UTC m=+9292.214260784" Dec 06 09:32:02 crc kubenswrapper[4954]: I1206 09:32:02.168213 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:32:02 crc kubenswrapper[4954]: I1206 09:32:02.170550 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:32:02 crc kubenswrapper[4954]: I1206 09:32:02.220329 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:32:02 crc kubenswrapper[4954]: I1206 09:32:02.460652 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdrq8" Dec 06 09:32:05 crc kubenswrapper[4954]: I1206 09:32:05.845176 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"] Dec 06 09:32:05 crc kubenswrapper[4954]: I1206 09:32:05.845900 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gdrq8" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="registry-server" containerID="cri-o://c4417bd3d9d8c1a91b7b1e10266bb305cdd80e95c983db19419688ab076d99b4" gracePeriod=2 Dec 06 09:32:06 crc kubenswrapper[4954]: I1206 09:32:06.458198 4954 generic.go:334] "Generic (PLEG): container finished" podID="09b57622-447b-4285-8acc-c005b994cab0" containerID="c4417bd3d9d8c1a91b7b1e10266bb305cdd80e95c983db19419688ab076d99b4" exitCode=0 Dec 06 09:32:06 crc kubenswrapper[4954]: I1206 09:32:06.458273 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerDied","Data":"c4417bd3d9d8c1a91b7b1e10266bb305cdd80e95c983db19419688ab076d99b4"} Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.084601 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdrq8"
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.232682 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content\") pod \"09b57622-447b-4285-8acc-c005b994cab0\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") "
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.233259 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities\") pod \"09b57622-447b-4285-8acc-c005b994cab0\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") "
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.233317 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdnw2\" (UniqueName: \"kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2\") pod \"09b57622-447b-4285-8acc-c005b994cab0\" (UID: \"09b57622-447b-4285-8acc-c005b994cab0\") "
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.273101 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities" (OuterVolumeSpecName: "utilities") pod "09b57622-447b-4285-8acc-c005b994cab0" (UID: "09b57622-447b-4285-8acc-c005b994cab0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.282965 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2" (OuterVolumeSpecName: "kube-api-access-pdnw2") pod "09b57622-447b-4285-8acc-c005b994cab0" (UID: "09b57622-447b-4285-8acc-c005b994cab0"). InnerVolumeSpecName "kube-api-access-pdnw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.314762 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09b57622-447b-4285-8acc-c005b994cab0" (UID: "09b57622-447b-4285-8acc-c005b994cab0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.337022 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.337075 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b57622-447b-4285-8acc-c005b994cab0-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.337089 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdnw2\" (UniqueName: \"kubernetes.io/projected/09b57622-447b-4285-8acc-c005b994cab0-kube-api-access-pdnw2\") on node \"crc\" DevicePath \"\""
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.473088 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdrq8" event={"ID":"09b57622-447b-4285-8acc-c005b994cab0","Type":"ContainerDied","Data":"af9850a34366c178867e5773e5109de70323e1f66a8f43acad2aa08d37ab1859"}
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.473115 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdrq8"
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.473324 4954 scope.go:117] "RemoveContainer" containerID="c4417bd3d9d8c1a91b7b1e10266bb305cdd80e95c983db19419688ab076d99b4"
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.502082 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"]
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.510682 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gdrq8"]
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.511388 4954 scope.go:117] "RemoveContainer" containerID="1f808ead3579682297a3ad29695ec3054726390f49791829a83a32de6f5bb427"
Dec 06 09:32:07 crc kubenswrapper[4954]: I1206 09:32:07.533680 4954 scope.go:117] "RemoveContainer" containerID="70f80c44e2d8cad5e49f17961f3bff7541629aed8f858880daa61545a57bc4be"
Dec 06 09:32:09 crc kubenswrapper[4954]: I1206 09:32:09.443967 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:32:09 crc kubenswrapper[4954]: E1206 09:32:09.444801 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:32:09 crc kubenswrapper[4954]: I1206 09:32:09.471360 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b57622-447b-4285-8acc-c005b994cab0" path="/var/lib/kubelet/pods/09b57622-447b-4285-8acc-c005b994cab0/volumes"
Dec 06 09:32:24 crc kubenswrapper[4954]: I1206 09:32:24.443106 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:32:24 crc kubenswrapper[4954]: E1206 09:32:24.443851 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:32:35 crc kubenswrapper[4954]: I1206 09:32:35.452014 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:32:35 crc kubenswrapper[4954]: E1206 09:32:35.453016 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:32:49 crc kubenswrapper[4954]: I1206 09:32:49.444617 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:32:49 crc kubenswrapper[4954]: E1206 09:32:49.445452 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.592731 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:32:58 crc kubenswrapper[4954]: E1206 09:32:58.593722 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="extract-content"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.593739 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="extract-content"
Dec 06 09:32:58 crc kubenswrapper[4954]: E1206 09:32:58.593763 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="registry-server"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.593772 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="registry-server"
Dec 06 09:32:58 crc kubenswrapper[4954]: E1206 09:32:58.593813 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="extract-utilities"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.593823 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="extract-utilities"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.594055 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b57622-447b-4285-8acc-c005b994cab0" containerName="registry-server"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.595811 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.607001 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.732243 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.732660 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.732732 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4bhz\" (UniqueName: \"kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.834173 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4bhz\" (UniqueName: \"kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.834367 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.834436 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.834987 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.835020 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.864035 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4bhz\" (UniqueName: \"kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz\") pod \"redhat-operators-8qsn6\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") " pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:58 crc kubenswrapper[4954]: I1206 09:32:58.930406 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:32:59 crc kubenswrapper[4954]: I1206 09:32:59.596194 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:33:00 crc kubenswrapper[4954]: I1206 09:33:00.087354 4954 generic.go:334] "Generic (PLEG): container finished" podID="8b141560-70f2-48f1-8e95-8092255880ce" containerID="df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4" exitCode=0
Dec 06 09:33:00 crc kubenswrapper[4954]: I1206 09:33:00.087455 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerDied","Data":"df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4"}
Dec 06 09:33:00 crc kubenswrapper[4954]: I1206 09:33:00.087671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerStarted","Data":"1107a3ee9716f6c13f9e597cf7c5b4c25706f3236b5011c9893d8aa284f1b032"}
Dec 06 09:33:00 crc kubenswrapper[4954]: I1206 09:33:00.089655 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 09:33:00 crc kubenswrapper[4954]: I1206 09:33:00.443725 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:33:00 crc kubenswrapper[4954]: E1206 09:33:00.444134 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:33:01 crc kubenswrapper[4954]: I1206 09:33:01.098420 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerStarted","Data":"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"}
Dec 06 09:33:05 crc kubenswrapper[4954]: I1206 09:33:05.137944 4954 generic.go:334] "Generic (PLEG): container finished" podID="8b141560-70f2-48f1-8e95-8092255880ce" containerID="4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d" exitCode=0
Dec 06 09:33:05 crc kubenswrapper[4954]: I1206 09:33:05.138014 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerDied","Data":"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"}
Dec 06 09:33:06 crc kubenswrapper[4954]: I1206 09:33:06.161848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerStarted","Data":"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"}
Dec 06 09:33:06 crc kubenswrapper[4954]: I1206 09:33:06.195905 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8qsn6" podStartSLOduration=2.61454516 podStartE2EDuration="8.195862884s" podCreationTimestamp="2025-12-06 09:32:58 +0000 UTC" firstStartedPulling="2025-12-06 09:33:00.089316018 +0000 UTC m=+9354.902675407" lastFinishedPulling="2025-12-06 09:33:05.670633732 +0000 UTC m=+9360.483993131" observedRunningTime="2025-12-06 09:33:06.188832606 +0000 UTC m=+9361.002192005" watchObservedRunningTime="2025-12-06 09:33:06.195862884 +0000 UTC m=+9361.009222273"
Dec 06 09:33:08 crc kubenswrapper[4954]: I1206 09:33:08.930878 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:08 crc kubenswrapper[4954]: I1206 09:33:08.931205 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:09 crc kubenswrapper[4954]: I1206 09:33:09.984703 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8qsn6" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="registry-server" probeResult="failure" output=<
Dec 06 09:33:09 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 09:33:09 crc kubenswrapper[4954]: >
Dec 06 09:33:11 crc kubenswrapper[4954]: I1206 09:33:11.444763 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:33:11 crc kubenswrapper[4954]: E1206 09:33:11.445067 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:33:19 crc kubenswrapper[4954]: I1206 09:33:19.070440 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:19 crc kubenswrapper[4954]: I1206 09:33:19.119890 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:19 crc kubenswrapper[4954]: I1206 09:33:19.316470 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.315134 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8qsn6" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="registry-server" containerID="cri-o://cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b" gracePeriod=2
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.795008 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.892272 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content\") pod \"8b141560-70f2-48f1-8e95-8092255880ce\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") "
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.892391 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4bhz\" (UniqueName: \"kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz\") pod \"8b141560-70f2-48f1-8e95-8092255880ce\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") "
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.892454 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities\") pod \"8b141560-70f2-48f1-8e95-8092255880ce\" (UID: \"8b141560-70f2-48f1-8e95-8092255880ce\") "
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.893330 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities" (OuterVolumeSpecName: "utilities") pod "8b141560-70f2-48f1-8e95-8092255880ce" (UID: "8b141560-70f2-48f1-8e95-8092255880ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.900744 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz" (OuterVolumeSpecName: "kube-api-access-r4bhz") pod "8b141560-70f2-48f1-8e95-8092255880ce" (UID: "8b141560-70f2-48f1-8e95-8092255880ce"). InnerVolumeSpecName "kube-api-access-r4bhz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.996157 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4bhz\" (UniqueName: \"kubernetes.io/projected/8b141560-70f2-48f1-8e95-8092255880ce-kube-api-access-r4bhz\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.996449 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:20 crc kubenswrapper[4954]: I1206 09:33:20.998155 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b141560-70f2-48f1-8e95-8092255880ce" (UID: "8b141560-70f2-48f1-8e95-8092255880ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.097131 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b141560-70f2-48f1-8e95-8092255880ce-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.344648 4954 generic.go:334] "Generic (PLEG): container finished" podID="8b141560-70f2-48f1-8e95-8092255880ce" containerID="cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b" exitCode=0
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.344918 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerDied","Data":"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"}
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.344957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qsn6" event={"ID":"8b141560-70f2-48f1-8e95-8092255880ce","Type":"ContainerDied","Data":"1107a3ee9716f6c13f9e597cf7c5b4c25706f3236b5011c9893d8aa284f1b032"}
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.345034 4954 scope.go:117] "RemoveContainer" containerID="cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.345642 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qsn6"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.371576 4954 scope.go:117] "RemoveContainer" containerID="4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.386590 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.396823 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8qsn6"]
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.456004 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b141560-70f2-48f1-8e95-8092255880ce" path="/var/lib/kubelet/pods/8b141560-70f2-48f1-8e95-8092255880ce/volumes"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.593302 4954 scope.go:117] "RemoveContainer" containerID="df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.731474 4954 scope.go:117] "RemoveContainer" containerID="cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"
Dec 06 09:33:21 crc kubenswrapper[4954]: E1206 09:33:21.732521 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b\": container with ID starting with cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b not found: ID does not exist" containerID="cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.732614 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b"} err="failed to get container status \"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b\": rpc error: code = NotFound desc = could not find container \"cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b\": container with ID starting with cfcdef044c9b2568f4483153ae6e58ee574f36da3afa6d965fd105fcaf93f37b not found: ID does not exist"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.732658 4954 scope.go:117] "RemoveContainer" containerID="4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"
Dec 06 09:33:21 crc kubenswrapper[4954]: E1206 09:33:21.733381 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d\": container with ID starting with 4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d not found: ID does not exist" containerID="4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.733430 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d"} err="failed to get container status \"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d\": rpc error: code = NotFound desc = could not find container \"4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d\": container with ID starting with 4bf36521a55d29e0372c999fdda49f072c0eec5fcbe3bdd0bbd95455cb159c1d not found: ID does not exist"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.733459 4954 scope.go:117] "RemoveContainer" containerID="df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4"
Dec 06 09:33:21 crc kubenswrapper[4954]: E1206 09:33:21.733713 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4\": container with ID starting with df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4 not found: ID does not exist" containerID="df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4"
Dec 06 09:33:21 crc kubenswrapper[4954]: I1206 09:33:21.733744 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4"} err="failed to get container status \"df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4\": rpc error: code = NotFound desc = could not find container \"df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4\": container with ID starting with df1e95c382e1b866ef4100bcbf1d1aa21ebd809bd58e2248dd4068d67203bbd4 not found: ID does not exist"
Dec 06 09:33:24 crc kubenswrapper[4954]: I1206 09:33:24.443231 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:33:24 crc kubenswrapper[4954]: E1206 09:33:24.444111 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:33:38 crc kubenswrapper[4954]: I1206 09:33:38.442909 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:33:38 crc kubenswrapper[4954]: E1206 09:33:38.443668 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:33:52 crc kubenswrapper[4954]: I1206 09:33:52.443996 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:33:53 crc kubenswrapper[4954]: I1206 09:33:53.691415 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034"}
Dec 06 09:35:33 crc kubenswrapper[4954]: I1206 09:35:33.646214 4954 generic.go:334] "Generic (PLEG): container finished" podID="5f829cef-d637-404f-b735-02768488c5f7" containerID="08684d0feb3652159eea23a7a56bcc261ab03efe30feaa572967ffe0382f69ee" exitCode=0
Dec 06 09:35:33 crc kubenswrapper[4954]: I1206 09:35:33.646716 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" event={"ID":"5f829cef-d637-404f-b735-02768488c5f7","Type":"ContainerDied","Data":"08684d0feb3652159eea23a7a56bcc261ab03efe30feaa572967ffe0382f69ee"}
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.093235 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.199722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory\") pod \"5f829cef-d637-404f-b735-02768488c5f7\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") "
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.199792 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key\") pod \"5f829cef-d637-404f-b735-02768488c5f7\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") "
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.199840 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbwj4\" (UniqueName: \"kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4\") pod \"5f829cef-d637-404f-b735-02768488c5f7\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") "
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.199911 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle\") pod \"5f829cef-d637-404f-b735-02768488c5f7\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") "
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.200027 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0\") pod \"5f829cef-d637-404f-b735-02768488c5f7\" (UID: \"5f829cef-d637-404f-b735-02768488c5f7\") "
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.205510 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5f829cef-d637-404f-b735-02768488c5f7" (UID: "5f829cef-d637-404f-b735-02768488c5f7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.206140 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4" (OuterVolumeSpecName: "kube-api-access-nbwj4") pod "5f829cef-d637-404f-b735-02768488c5f7" (UID: "5f829cef-d637-404f-b735-02768488c5f7"). InnerVolumeSpecName "kube-api-access-nbwj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.229985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory" (OuterVolumeSpecName: "inventory") pod "5f829cef-d637-404f-b735-02768488c5f7" (UID: "5f829cef-d637-404f-b735-02768488c5f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.230016 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f829cef-d637-404f-b735-02768488c5f7" (UID: "5f829cef-d637-404f-b735-02768488c5f7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.231339 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5f829cef-d637-404f-b735-02768488c5f7" (UID: "5f829cef-d637-404f-b735-02768488c5f7"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.309971 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-inventory\") on node \"crc\" DevicePath \"\""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.310008 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.310021 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbwj4\" (UniqueName: \"kubernetes.io/projected/5f829cef-d637-404f-b735-02768488c5f7-kube-api-access-nbwj4\") on node \"crc\" DevicePath \"\""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.310039 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.310053 4954 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5f829cef-d637-404f-b735-02768488c5f7-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.668775 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8" event={"ID":"5f829cef-d637-404f-b735-02768488c5f7","Type":"ContainerDied","Data":"3bebc40a58278a93af3dfd0b29108945d0e4e3f6bc11e2b940fc8b7495f4596b"}
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.668815 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bebc40a58278a93af3dfd0b29108945d0e4e3f6bc11e2b940fc8b7495f4596b"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.669164 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rjsg8"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762057 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vg4vw"]
Dec 06 09:35:35 crc kubenswrapper[4954]: E1206 09:35:35.762533 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="registry-server"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762558 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="registry-server"
Dec 06 09:35:35 crc kubenswrapper[4954]: E1206 09:35:35.762598 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f829cef-d637-404f-b735-02768488c5f7" containerName="libvirt-openstack-openstack-cell1"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762606 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f829cef-d637-404f-b735-02768488c5f7" containerName="libvirt-openstack-openstack-cell1"
Dec 06 09:35:35 crc kubenswrapper[4954]: E1206 09:35:35.762628 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="extract-content"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762638 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="extract-content"
Dec 06 09:35:35 crc kubenswrapper[4954]: E1206 09:35:35.762668 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="extract-utilities"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762676 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="extract-utilities"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762920 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b141560-70f2-48f1-8e95-8092255880ce" containerName="registry-server"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.762951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f829cef-d637-404f-b735-02768488c5f7" containerName="libvirt-openstack-openstack-cell1"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.763860 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.768375 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.768402 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.768758 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.768920 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.769282 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.769368 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.770859 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.774727 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vg4vw"]
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.920805 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.920849 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcngc\" (UniqueName: \"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.920874 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921073 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921128 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:35 crc kubenswrapper[4954]: I1206 09:35:35.921238 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022630 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022667 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcngc\" (UniqueName: \"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022774 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022791 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022834 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022851 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.022889 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.023980 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.028624 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.028908 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.029241 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.029817 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.030149 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.030489 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.039908 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.040171 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcngc\" (UniqueName: \"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc\") pod \"nova-cell1-openstack-openstack-cell1-vg4vw\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.088686 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.602860 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vg4vw"]
Dec 06 09:35:36 crc kubenswrapper[4954]: I1206 09:35:36.679006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" event={"ID":"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c","Type":"ContainerStarted","Data":"febed735623256cd642c12c697ec89b8bc8dbe50c61ff058afd1815908960e22"}
Dec 06 09:35:37 crc kubenswrapper[4954]: I1206 09:35:37.688752 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" event={"ID":"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c","Type":"ContainerStarted","Data":"6f3019ef880029986ff056b3b424c183d1dd91e773950946d50755d65de6d137"}
Dec 06 09:35:37 crc kubenswrapper[4954]: I1206 09:35:37.714076 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" podStartSLOduration=2.291627417 podStartE2EDuration="2.714053418s" podCreationTimestamp="2025-12-06 09:35:35 +0000 UTC" firstStartedPulling="2025-12-06 09:35:36.608475414 +0000 UTC m=+9511.421834803" lastFinishedPulling="2025-12-06 09:35:37.030901415 +0000 UTC m=+9511.844260804" observedRunningTime="2025-12-06 09:35:37.703790264 +0000 UTC m=+9512.517149653" watchObservedRunningTime="2025-12-06 09:35:37.714053418 +0000 UTC m=+9512.527412807"
Dec 06 09:36:10 crc kubenswrapper[4954]: I1206 09:36:10.101527 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:36:10 crc kubenswrapper[4954]: I1206 09:36:10.102048 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:36:40 crc kubenswrapper[4954]: I1206 09:36:40.101578 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:36:40 crc kubenswrapper[4954]: I1206 09:36:40.103210 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.101450 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.102077 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.102145 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.103079 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.103134 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034" gracePeriod=600
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.683600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034"}
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.684289 4954 scope.go:117] "RemoveContainer" containerID="7c27b6954904322aab8071ca81e71bd0c4d9a5a08c93b84981fcf82f74a3681e"
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.683548 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034" exitCode=0
Dec 06 09:37:10 crc kubenswrapper[4954]: I1206 09:37:10.684537 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"}
Dec 06 09:39:10 crc kubenswrapper[4954]: I1206 09:39:10.101787 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:39:10 crc kubenswrapper[4954]: I1206 09:39:10.102893 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:39:10 crc kubenswrapper[4954]: I1206 09:39:10.925977 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"]
Dec 06 09:39:10 crc kubenswrapper[4954]: I1206 09:39:10.929150 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:10 crc kubenswrapper[4954]: I1206 09:39:10.943079 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"]
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.074946 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.075003 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.075159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcr27\" (UniqueName: \"kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.177339 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcr27\" (UniqueName: \"kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.177601 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.177681 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.178187 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.178244 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.198246 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcr27\" (UniqueName: \"kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27\") pod \"redhat-marketplace-2kc8d\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.254063 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kc8d"
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.775631 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"]
Dec 06 09:39:11 crc kubenswrapper[4954]: I1206 09:39:11.921731 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerStarted","Data":"1c1479fbd529c307dc613f3664558614923676f1d947cc85b75e701301d30535"}
Dec 06 09:39:12 crc kubenswrapper[4954]: I1206 09:39:12.942325 4954 generic.go:334] "Generic (PLEG): container finished" podID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerID="e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49" exitCode=0
Dec 06 09:39:12 crc kubenswrapper[4954]: I1206 09:39:12.942379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerDied","Data":"e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49"}
Dec 06 09:39:12 crc kubenswrapper[4954]: I1206 09:39:12.944836 4954 generic.go:334] "Generic (PLEG): container finished" podID="88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" containerID="6f3019ef880029986ff056b3b424c183d1dd91e773950946d50755d65de6d137" exitCode=0
Dec 06 09:39:12 crc kubenswrapper[4954]: I1206 09:39:12.944901 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" event={"ID":"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c","Type":"ContainerDied","Data":"6f3019ef880029986ff056b3b424c183d1dd91e773950946d50755d65de6d137"}
Dec 06 09:39:12 crc kubenswrapper[4954]: I1206 09:39:12.945780 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 09:39:13 crc kubenswrapper[4954]: I1206 09:39:13.956337 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerStarted","Data":"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4"}
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.378947 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw"
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.453782 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.453825 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.453844 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.453885 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.453962 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.454029 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.454062 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcngc\" (UniqueName: \"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.455358 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.456042 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory\") pod \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\" (UID: \"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c\") "
Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.461085 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc" (OuterVolumeSpecName: "kube-api-access-jcngc") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "kube-api-access-jcngc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.461170 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.485238 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.486782 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.486975 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.488734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.488971 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory" (OuterVolumeSpecName: "inventory") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.494405 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.515961 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" (UID: "88a2ce35-e4bc-4a06-9f75-9dfc51828c7c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560273 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560461 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560474 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560484 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560494 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560504 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560515 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560524 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.560533 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcngc\" (UniqueName: \"kubernetes.io/projected/88a2ce35-e4bc-4a06-9f75-9dfc51828c7c-kube-api-access-jcngc\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.966288 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.966348 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vg4vw" event={"ID":"88a2ce35-e4bc-4a06-9f75-9dfc51828c7c","Type":"ContainerDied","Data":"febed735623256cd642c12c697ec89b8bc8dbe50c61ff058afd1815908960e22"} Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.966403 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="febed735623256cd642c12c697ec89b8bc8dbe50c61ff058afd1815908960e22" Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.968277 4954 generic.go:334] "Generic (PLEG): container finished" podID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerID="6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4" exitCode=0 Dec 06 09:39:14 crc kubenswrapper[4954]: I1206 09:39:14.968309 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerDied","Data":"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4"} Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.114266 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-2k4qx"] Dec 06 09:39:15 crc kubenswrapper[4954]: E1206 09:39:15.114932 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" containerName="nova-cell1-openstack-openstack-cell1" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.114958 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" containerName="nova-cell1-openstack-openstack-cell1" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.115158 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a2ce35-e4bc-4a06-9f75-9dfc51828c7c" containerName="nova-cell1-openstack-openstack-cell1" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.115954 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.117961 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.118199 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.118821 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.118956 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.119174 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.125969 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-2k4qx"] Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273433 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273548 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273593 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273614 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273642 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g88qf\" (UniqueName: \"kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273672 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.273698 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375168 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375324 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375351 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375371 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375401 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g88qf\" (UniqueName: \"kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375581 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.375628 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.769878 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.770312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.770449 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.770493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.770767 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.770816 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:15 crc kubenswrapper[4954]: I1206 09:39:15.771362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g88qf\" (UniqueName: \"kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf\") pod \"telemetry-openstack-openstack-cell1-2k4qx\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:16 crc kubenswrapper[4954]: I1206 09:39:16.034189 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:39:16 crc kubenswrapper[4954]: I1206 09:39:16.585108 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-2k4qx"] Dec 06 09:39:16 crc kubenswrapper[4954]: W1206 09:39:16.593975 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod509d808a_804b_405a_9a0a_545a8d15a90e.slice/crio-db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29 WatchSource:0}: Error finding container db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29: Status 404 returned error can't find the container with id db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29 Dec 06 09:39:16 crc kubenswrapper[4954]: I1206 09:39:16.996386 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" event={"ID":"509d808a-804b-405a-9a0a-545a8d15a90e","Type":"ContainerStarted","Data":"db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29"} Dec 06 09:39:16 crc kubenswrapper[4954]: I1206 09:39:16.998907 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerStarted","Data":"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57"} Dec 06 09:39:17 crc kubenswrapper[4954]: I1206 09:39:17.019532 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2kc8d" podStartSLOduration=4.622385876 podStartE2EDuration="7.019479714s" podCreationTimestamp="2025-12-06 09:39:10 +0000 UTC" firstStartedPulling="2025-12-06 09:39:12.945426119 +0000 UTC m=+9727.758785508" lastFinishedPulling="2025-12-06 09:39:15.342519957 +0000 UTC m=+9730.155879346" observedRunningTime="2025-12-06 09:39:17.013492424 +0000 UTC m=+9731.826851813" watchObservedRunningTime="2025-12-06 09:39:17.019479714 +0000 UTC m=+9731.832839103" Dec 06 09:39:18 crc kubenswrapper[4954]: I1206 09:39:18.010216 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" event={"ID":"509d808a-804b-405a-9a0a-545a8d15a90e","Type":"ContainerStarted","Data":"7bfc01d6e69d36ee379ce7e63df7bf01e26c96d146a980ebce93042d57739357"} Dec 06 09:39:18 crc kubenswrapper[4954]: I1206 09:39:18.036448 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" podStartSLOduration=2.401814678 podStartE2EDuration="3.036430361s" podCreationTimestamp="2025-12-06 09:39:15 +0000 UTC" firstStartedPulling="2025-12-06 09:39:16.598889421 +0000 UTC m=+9731.412248810" lastFinishedPulling="2025-12-06 09:39:17.233505104 +0000 UTC m=+9732.046864493" observedRunningTime="2025-12-06 09:39:18.030192535 +0000 UTC m=+9732.843551924" watchObservedRunningTime="2025-12-06 09:39:18.036430361 +0000 UTC m=+9732.849789750" Dec 06 09:39:21 crc kubenswrapper[4954]: I1206 09:39:21.255106 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:21 crc kubenswrapper[4954]: I1206 09:39:21.255723 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:21 crc kubenswrapper[4954]: I1206 09:39:21.304081 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:22 crc kubenswrapper[4954]: I1206 09:39:22.095377 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:22 crc kubenswrapper[4954]: I1206 09:39:22.150179 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"] Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.063804 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2kc8d" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="registry-server" containerID="cri-o://41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57" gracePeriod=2 Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.619656 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.769613 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcr27\" (UniqueName: \"kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27\") pod \"a58a0057-7b80-40e3-a76f-5d71b277f930\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.769829 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities\") pod \"a58a0057-7b80-40e3-a76f-5d71b277f930\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.769932 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content\") pod \"a58a0057-7b80-40e3-a76f-5d71b277f930\" (UID: \"a58a0057-7b80-40e3-a76f-5d71b277f930\") " Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.770813 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities" (OuterVolumeSpecName: "utilities") pod "a58a0057-7b80-40e3-a76f-5d71b277f930" (UID: "a58a0057-7b80-40e3-a76f-5d71b277f930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.775741 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27" (OuterVolumeSpecName: "kube-api-access-vcr27") pod "a58a0057-7b80-40e3-a76f-5d71b277f930" (UID: "a58a0057-7b80-40e3-a76f-5d71b277f930"). InnerVolumeSpecName "kube-api-access-vcr27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.789652 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a58a0057-7b80-40e3-a76f-5d71b277f930" (UID: "a58a0057-7b80-40e3-a76f-5d71b277f930"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.872845 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcr27\" (UniqueName: \"kubernetes.io/projected/a58a0057-7b80-40e3-a76f-5d71b277f930-kube-api-access-vcr27\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.872893 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:24 crc kubenswrapper[4954]: I1206 09:39:24.872907 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58a0057-7b80-40e3-a76f-5d71b277f930-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.077836 4954 generic.go:334] "Generic (PLEG): container finished" podID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerID="41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57" exitCode=0 Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.077905 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerDied","Data":"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57"} Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.078036 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kc8d" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.078073 4954 scope.go:117] "RemoveContainer" containerID="41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.077945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kc8d" event={"ID":"a58a0057-7b80-40e3-a76f-5d71b277f930","Type":"ContainerDied","Data":"1c1479fbd529c307dc613f3664558614923676f1d947cc85b75e701301d30535"} Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.100882 4954 scope.go:117] "RemoveContainer" containerID="6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.122967 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"] Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.133825 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kc8d"] Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.161598 4954 scope.go:117] "RemoveContainer" containerID="e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.181669 4954 scope.go:117] "RemoveContainer" containerID="41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57" Dec 06 09:39:25 crc kubenswrapper[4954]: E1206 09:39:25.182074 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57\": container with ID starting with 41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57 not found: ID does not exist" containerID="41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.182147 4954 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57"} err="failed to get container status \"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57\": rpc error: code = NotFound desc = could not find container \"41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57\": container with ID starting with 41adff8cd13577f6367f4230f8d42eb91eaaf693f72f1f53cb509cf30c485a57 not found: ID does not exist" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.182172 4954 scope.go:117] "RemoveContainer" containerID="6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4" Dec 06 09:39:25 crc kubenswrapper[4954]: E1206 09:39:25.182373 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4\": container with ID starting with 6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4 not found: ID does not exist" containerID="6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.182399 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4"} err="failed to get container status \"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4\": rpc error: code = NotFound desc = could not find container \"6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4\": container with ID starting with 6a34fb803a020aa22e2ed6f3590199e985797d1e407b3f8c963eba5bd8ff48d4 not found: ID does not exist" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.182413 4954 scope.go:117] "RemoveContainer" containerID="e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49" Dec 06 09:39:25 crc kubenswrapper[4954]: E1206 09:39:25.182704 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49\": container with ID starting with e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49 not found: ID does not exist" containerID="e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.182747 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49"} err="failed to get container status \"e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49\": rpc error: code = NotFound desc = could not find container \"e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49\": container with ID starting with e9a7be5d5b340e010618dcd43d087150ee59a819a7d4ffdec6ee3d3e1b7d2a49 not found: ID does not exist" Dec 06 09:39:25 crc kubenswrapper[4954]: I1206 09:39:25.454977 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" path="/var/lib/kubelet/pods/a58a0057-7b80-40e3-a76f-5d71b277f930/volumes" Dec 06 09:39:40 crc kubenswrapper[4954]: I1206 09:39:40.101711 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:39:40 crc kubenswrapper[4954]: I1206 09:39:40.102590 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.100763 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.101210 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.101253 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.102028 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.102080 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" gracePeriod=600 Dec 06 09:40:10 crc kubenswrapper[4954]: E1206 09:40:10.236139 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.678829 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" exitCode=0 Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.678939 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"} Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 09:40:10.679119 4954 scope.go:117] "RemoveContainer" containerID="2efff635b849e4eb533aeea40baff5ba7c258b3f88206045b79eb865856ad034" Dec 06 09:40:10 crc kubenswrapper[4954]: I1206 
09:40:10.680110 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:40:10 crc kubenswrapper[4954]: E1206 09:40:10.680525 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:40:25 crc kubenswrapper[4954]: I1206 09:40:25.451998 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:40:25 crc kubenswrapper[4954]: E1206 09:40:25.452919 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:40:37 crc kubenswrapper[4954]: I1206 09:40:37.443707 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:40:37 crc kubenswrapper[4954]: E1206 09:40:37.444475 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:40:50 crc kubenswrapper[4954]: I1206 09:40:50.444637 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:40:50 crc kubenswrapper[4954]: E1206 09:40:50.445758 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:02 crc kubenswrapper[4954]: I1206 09:41:02.443148 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:41:02 crc kubenswrapper[4954]: E1206 09:41:02.443863 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:17 crc kubenswrapper[4954]: I1206 09:41:17.444886 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:41:17 crc kubenswrapper[4954]: E1206 09:41:17.445785 
4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:29 crc kubenswrapper[4954]: I1206 09:41:29.444219 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:41:29 crc kubenswrapper[4954]: E1206 09:41:29.445341 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.116247 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:40 crc kubenswrapper[4954]: E1206 09:41:40.117323 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="extract-content" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.117338 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="extract-content" Dec 06 09:41:40 crc kubenswrapper[4954]: E1206 09:41:40.117377 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="extract-utilities" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.117384 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="extract-utilities" Dec 06 09:41:40 crc kubenswrapper[4954]: E1206 09:41:40.117393 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="registry-server" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.117400 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="registry-server" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.117630 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58a0057-7b80-40e3-a76f-5d71b277f930" containerName="registry-server" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.119285 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.137786 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.240321 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49fj\" (UniqueName: \"kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.240674 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.240813 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.343406 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l49fj\" (UniqueName: \"kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.343461 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.343533 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.344227 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.344291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.364628 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l49fj\" (UniqueName: \"kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj\") pod \"community-operators-x85rf\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:40 crc kubenswrapper[4954]: I1206 09:41:40.456332 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:41 crc kubenswrapper[4954]: I1206 09:41:41.006959 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:41 crc kubenswrapper[4954]: I1206 09:41:41.556053 4954 generic.go:334] "Generic (PLEG): container finished" podID="42877e5d-7a92-4b16-880c-07638ded2778" containerID="3cf7199f141cf3a8bfc733a02fca44daf712e2afae25fa149be377bb2a80e67c" exitCode=0 Dec 06 09:41:41 crc kubenswrapper[4954]: I1206 09:41:41.556104 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerDied","Data":"3cf7199f141cf3a8bfc733a02fca44daf712e2afae25fa149be377bb2a80e67c"} Dec 06 09:41:41 crc kubenswrapper[4954]: I1206 09:41:41.556133 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerStarted","Data":"1f0076ff4df101a20888f0543263db01652715320a6461b4c56343369b2342b6"} Dec 06 09:41:42 crc kubenswrapper[4954]: I1206 09:41:42.445302 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:41:42 crc kubenswrapper[4954]: E1206 09:41:42.445882 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:42 crc kubenswrapper[4954]: I1206 09:41:42.566744 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerStarted","Data":"591e630964e1b404a9d53867e0a8861a77331c172a529d8137a98ae2d68f3dcf"} Dec 06 09:41:43 crc kubenswrapper[4954]: I1206 09:41:43.581441 4954 generic.go:334] "Generic (PLEG): container finished" podID="42877e5d-7a92-4b16-880c-07638ded2778" containerID="591e630964e1b404a9d53867e0a8861a77331c172a529d8137a98ae2d68f3dcf" exitCode=0 Dec 06 09:41:43 crc kubenswrapper[4954]: I1206 09:41:43.581690 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerDied","Data":"591e630964e1b404a9d53867e0a8861a77331c172a529d8137a98ae2d68f3dcf"} Dec 06 09:41:44 crc kubenswrapper[4954]: I1206 09:41:44.607514 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerStarted","Data":"11e388b7d9a3878d863e4e06b32ea6a46016220797814b445745338b711015fc"} Dec 06 09:41:44 crc kubenswrapper[4954]: I1206 09:41:44.635721 4954 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x85rf" podStartSLOduration=2.01443253 podStartE2EDuration="4.635702399s" podCreationTimestamp="2025-12-06 09:41:40 +0000 UTC" firstStartedPulling="2025-12-06 09:41:41.55789686 +0000 UTC m=+9876.371256249" lastFinishedPulling="2025-12-06 09:41:44.179166729 +0000 UTC m=+9878.992526118" observedRunningTime="2025-12-06 09:41:44.62484013 +0000 UTC m=+9879.438199519" watchObservedRunningTime="2025-12-06 09:41:44.635702399 +0000 UTC m=+9879.449061788" Dec 06 09:41:50 crc kubenswrapper[4954]: I1206 09:41:50.456858 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:50 crc kubenswrapper[4954]: I1206 09:41:50.457436 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:51 crc kubenswrapper[4954]: I1206 09:41:51.133410 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:51 crc kubenswrapper[4954]: I1206 09:41:51.179790 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:51 crc kubenswrapper[4954]: I1206 09:41:51.369265 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:52 crc kubenswrapper[4954]: I1206 09:41:52.722869 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x85rf" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="registry-server" containerID="cri-o://11e388b7d9a3878d863e4e06b32ea6a46016220797814b445745338b711015fc" gracePeriod=2 Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.443762 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:41:53 crc kubenswrapper[4954]: E1206 09:41:53.444305 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.736801 4954 generic.go:334] "Generic (PLEG): container finished" podID="42877e5d-7a92-4b16-880c-07638ded2778" containerID="11e388b7d9a3878d863e4e06b32ea6a46016220797814b445745338b711015fc" exitCode=0 Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.736877 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerDied","Data":"11e388b7d9a3878d863e4e06b32ea6a46016220797814b445745338b711015fc"} Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.737767 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x85rf" event={"ID":"42877e5d-7a92-4b16-880c-07638ded2778","Type":"ContainerDied","Data":"1f0076ff4df101a20888f0543263db01652715320a6461b4c56343369b2342b6"} Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.737841 4954 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="1f0076ff4df101a20888f0543263db01652715320a6461b4c56343369b2342b6" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.762219 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.790759 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content\") pod \"42877e5d-7a92-4b16-880c-07638ded2778\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.791203 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l49fj\" (UniqueName: \"kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj\") pod \"42877e5d-7a92-4b16-880c-07638ded2778\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.791392 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities\") pod \"42877e5d-7a92-4b16-880c-07638ded2778\" (UID: \"42877e5d-7a92-4b16-880c-07638ded2778\") " Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.792279 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities" (OuterVolumeSpecName: "utilities") pod "42877e5d-7a92-4b16-880c-07638ded2778" (UID: "42877e5d-7a92-4b16-880c-07638ded2778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.799536 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj" (OuterVolumeSpecName: "kube-api-access-l49fj") pod "42877e5d-7a92-4b16-880c-07638ded2778" (UID: "42877e5d-7a92-4b16-880c-07638ded2778"). InnerVolumeSpecName "kube-api-access-l49fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.857849 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42877e5d-7a92-4b16-880c-07638ded2778" (UID: "42877e5d-7a92-4b16-880c-07638ded2778"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.893785 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.893825 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l49fj\" (UniqueName: \"kubernetes.io/projected/42877e5d-7a92-4b16-880c-07638ded2778-kube-api-access-l49fj\") on node \"crc\" DevicePath \"\"" Dec 06 09:41:53 crc kubenswrapper[4954]: I1206 09:41:53.893837 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42877e5d-7a92-4b16-880c-07638ded2778-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:41:54 crc kubenswrapper[4954]: I1206 09:41:54.745830 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x85rf" Dec 06 09:41:54 crc kubenswrapper[4954]: I1206 09:41:54.780330 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:54 crc kubenswrapper[4954]: I1206 09:41:54.795108 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x85rf"] Dec 06 09:41:55 crc kubenswrapper[4954]: I1206 09:41:55.459345 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42877e5d-7a92-4b16-880c-07638ded2778" path="/var/lib/kubelet/pods/42877e5d-7a92-4b16-880c-07638ded2778/volumes" Dec 06 09:42:08 crc kubenswrapper[4954]: I1206 09:42:08.443669 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:42:08 crc kubenswrapper[4954]: E1206 09:42:08.444440 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:42:22 crc kubenswrapper[4954]: I1206 09:42:22.443595 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:42:22 crc kubenswrapper[4954]: E1206 09:42:22.444323 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:42:34 crc kubenswrapper[4954]: I1206 09:42:34.444171 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:42:34 crc kubenswrapper[4954]: E1206 09:42:34.445021 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Dec 06 09:42:46 crc kubenswrapper[4954]: I1206 09:42:46.443634 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:42:46 crc kubenswrapper[4954]: E1206 09:42:46.444262 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:00 crc kubenswrapper[4954]: I1206 09:43:00.467760 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:00 crc kubenswrapper[4954]: E1206 09:43:00.468883 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:11 crc kubenswrapper[4954]: I1206 09:43:11.443711 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:11 crc kubenswrapper[4954]: E1206 09:43:11.444657 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:22 crc kubenswrapper[4954]: I1206 09:43:22.444297 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:22 crc kubenswrapper[4954]: E1206 09:43:22.445365 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:34 crc kubenswrapper[4954]: I1206 09:43:34.443951 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:34 crc kubenswrapper[4954]: E1206 09:43:34.444952 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.689473 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:43:44 crc kubenswrapper[4954]: E1206 09:43:44.705904 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="registry-server"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.705989 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="registry-server"
Dec 06 09:43:44 crc kubenswrapper[4954]: E1206 09:43:44.706162 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="extract-content"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.706181 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="extract-content"
Dec 06 09:43:44 crc kubenswrapper[4954]: E1206 09:43:44.706263 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="extract-utilities"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.706287 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="extract-utilities"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.707233 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="42877e5d-7a92-4b16-880c-07638ded2778" containerName="registry-server"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.713006 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.713200 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.770212 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.770305 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.770344 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzpvv\" (UniqueName: \"kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.872334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.872446 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.872480 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzpvv\" (UniqueName: \"kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.872982 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.873093 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:44 crc kubenswrapper[4954]: I1206 09:43:44.895384 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzpvv\" (UniqueName: \"kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv\") pod \"redhat-operators-87jnb\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") " pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:45 crc kubenswrapper[4954]: I1206 09:43:45.043744 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:45 crc kubenswrapper[4954]: I1206 09:43:45.565770 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:43:45 crc kubenswrapper[4954]: I1206 09:43:45.952123 4954 generic.go:334] "Generic (PLEG): container finished" podID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerID="9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33" exitCode=0
Dec 06 09:43:45 crc kubenswrapper[4954]: I1206 09:43:45.952307 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerDied","Data":"9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33"}
Dec 06 09:43:45 crc kubenswrapper[4954]: I1206 09:43:45.952424 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerStarted","Data":"d90bfea11e54ac2eb27907b5306d0e931a2c4ceeff27300f4a8992fb56da96ff"}
Dec 06 09:43:46 crc kubenswrapper[4954]: I1206 09:43:46.443824 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:46 crc kubenswrapper[4954]: E1206 09:43:46.444430 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:43:47 crc kubenswrapper[4954]: I1206 09:43:47.971998 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerStarted","Data":"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"}
Dec 06 09:43:49 crc kubenswrapper[4954]: I1206 09:43:49.994436 4954 generic.go:334] "Generic (PLEG): container finished" podID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerID="7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104" exitCode=0
Dec 06 09:43:49 crc kubenswrapper[4954]: I1206 09:43:49.994492 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerDied","Data":"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"}
Dec 06 09:43:52 crc kubenswrapper[4954]: I1206 09:43:52.017306 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerStarted","Data":"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"}
Dec 06 09:43:52 crc kubenswrapper[4954]: I1206 09:43:52.041100 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-87jnb" podStartSLOduration=2.955941852 podStartE2EDuration="8.041057765s" podCreationTimestamp="2025-12-06 09:43:44 +0000 UTC" firstStartedPulling="2025-12-06 09:43:45.955500955 +0000 UTC m=+10000.768860344" lastFinishedPulling="2025-12-06 09:43:51.040616858 +0000 UTC m=+10005.853976257" observedRunningTime="2025-12-06 09:43:52.033520555 +0000 UTC m=+10006.846879944" watchObservedRunningTime="2025-12-06 09:43:52.041057765 +0000 UTC m=+10006.854417154"
Dec 06 09:43:55 crc kubenswrapper[4954]: I1206 09:43:55.044777 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:55 crc kubenswrapper[4954]: I1206 09:43:55.045092 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:43:56 crc kubenswrapper[4954]: I1206 09:43:56.096183 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-87jnb" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="registry-server" probeResult="failure" output=<
Dec 06 09:43:56 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 09:43:56 crc kubenswrapper[4954]: >
Dec 06 09:43:57 crc kubenswrapper[4954]: I1206 09:43:57.444314 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:43:57 crc kubenswrapper[4954]: E1206 09:43:57.444626 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:44:05 crc kubenswrapper[4954]: I1206 09:44:05.118511 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:44:05 crc kubenswrapper[4954]: I1206 09:44:05.178668 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:44:05 crc kubenswrapper[4954]: I1206 09:44:05.379423 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.168294 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-87jnb" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="registry-server" containerID="cri-o://1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb" gracePeriod=2
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.719396 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.824057 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzpvv\" (UniqueName: \"kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv\") pod \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") "
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.824123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities\") pod \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") "
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.824256 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content\") pod \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\" (UID: \"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1\") "
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.825037 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities" (OuterVolumeSpecName: "utilities") pod "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" (UID: "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.825266 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.829852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv" (OuterVolumeSpecName: "kube-api-access-pzpvv") pod "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" (UID: "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1"). InnerVolumeSpecName "kube-api-access-pzpvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.927593 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzpvv\" (UniqueName: \"kubernetes.io/projected/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-kube-api-access-pzpvv\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:06 crc kubenswrapper[4954]: I1206 09:44:06.946500 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" (UID: "0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.029889 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.178600 4954 generic.go:334] "Generic (PLEG): container finished" podID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerID="1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb" exitCode=0
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.178653 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87jnb"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.178677 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerDied","Data":"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"}
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.178720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87jnb" event={"ID":"0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1","Type":"ContainerDied","Data":"d90bfea11e54ac2eb27907b5306d0e931a2c4ceeff27300f4a8992fb56da96ff"}
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.178768 4954 scope.go:117] "RemoveContainer" containerID="1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.201155 4954 scope.go:117] "RemoveContainer" containerID="7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.217547 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.231220 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-87jnb"]
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.457659 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" path="/var/lib/kubelet/pods/0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1/volumes"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.685846 4954 scope.go:117] "RemoveContainer" containerID="9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.756985 4954 scope.go:117] "RemoveContainer" containerID="1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"
Dec 06 09:44:07 crc kubenswrapper[4954]: E1206 09:44:07.757532 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb\": container with ID starting with 1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb not found: ID does not exist" containerID="1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.757577 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb"} err="failed to get container status \"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb\": rpc error: code = NotFound desc = could not find container \"1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb\": container with ID starting with 1809797a746080a263652503ac78d572d07e5eaab2b04af896975262d45174bb not found: ID does not exist"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.757597 4954 scope.go:117] "RemoveContainer" containerID="7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"
Dec 06 09:44:07 crc kubenswrapper[4954]: E1206 09:44:07.758840 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104\": container with ID starting with 7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104 not found: ID does not exist" containerID="7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.758869 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104"} err="failed to get container status \"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104\": rpc error: code = NotFound desc = could not find container \"7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104\": container with ID starting with 7e059892a1f86207b6e32ad536ab178e33d2d2b87a6ee153b88ee988aa6a0104 not found: ID does not exist"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.758884 4954 scope.go:117] "RemoveContainer" containerID="9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33"
Dec 06 09:44:07 crc kubenswrapper[4954]: E1206 09:44:07.759118 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33\": container with ID starting with 9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33 not found: ID does not exist" containerID="9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33"
Dec 06 09:44:07 crc kubenswrapper[4954]: I1206 09:44:07.759142 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33"} err="failed to get container status \"9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33\": rpc error: code = NotFound desc = could not find container \"9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33\": container with ID starting with 9311480085f932bf0101ab076b2b5b238c8f2e9c0e4b4b5d1a2ab4b4cae71a33 not found: ID does not exist"
Dec 06 09:44:08 crc kubenswrapper[4954]: I1206 09:44:08.443741 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:44:08 crc kubenswrapper[4954]: E1206 09:44:08.444372 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:44:20 crc kubenswrapper[4954]: I1206 09:44:20.444195 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:44:20 crc kubenswrapper[4954]: E1206 09:44:20.445078 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:44:32 crc kubenswrapper[4954]: I1206 09:44:32.443487 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:44:32 crc kubenswrapper[4954]: E1206 09:44:32.444304 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:44:43 crc kubenswrapper[4954]: I1206 09:44:43.444193 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:44:43 crc kubenswrapper[4954]: E1206 09:44:43.444828 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:44:54 crc kubenswrapper[4954]: I1206 09:44:54.442949 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:44:54 crc kubenswrapper[4954]: E1206 09:44:54.443633 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.166647 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"]
Dec 06 09:45:00 crc kubenswrapper[4954]: E1206 09:45:00.167545 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="extract-content"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.167579 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="extract-content"
Dec 06 09:45:00 crc kubenswrapper[4954]: E1206 09:45:00.167605 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="extract-utilities"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.167611 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="extract-utilities"
Dec 06 09:45:00 crc kubenswrapper[4954]: E1206 09:45:00.167633 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="registry-server"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.167639 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="registry-server"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.167835 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc1d0a6-7bc7-4efc-b267-e26d1a7e33f1" containerName="registry-server"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.168608 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.170685 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.171087 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.179572 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"]
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.251670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mvwp\" (UniqueName: \"kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.252168 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.252451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.354305 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.354416 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mvwp\" (UniqueName: \"kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.354515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.357195 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.370428 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.373806 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mvwp\" (UniqueName: \"kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp\") pod \"collect-profiles-29416905-zmj6f\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:00 crc kubenswrapper[4954]: I1206 09:45:00.541552 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:01 crc kubenswrapper[4954]: I1206 09:45:01.062995 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"]
Dec 06 09:45:01 crc kubenswrapper[4954]: I1206 09:45:01.784050 4954 generic.go:334] "Generic (PLEG): container finished" podID="f29c6b96-904b-4656-a13d-c338c5173352" containerID="ae6775abc5896a431d58f0a2fe22ceaf79e7edd6b6c5ae3e64c048fa4b9b9166" exitCode=0
Dec 06 09:45:01 crc kubenswrapper[4954]: I1206 09:45:01.784108 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f" event={"ID":"f29c6b96-904b-4656-a13d-c338c5173352","Type":"ContainerDied","Data":"ae6775abc5896a431d58f0a2fe22ceaf79e7edd6b6c5ae3e64c048fa4b9b9166"}
Dec 06 09:45:01 crc kubenswrapper[4954]: I1206 09:45:01.784373 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f" event={"ID":"f29c6b96-904b-4656-a13d-c338c5173352","Type":"ContainerStarted","Data":"b466fa41e23b75e145aec8a8ff6174a2481d1519875c74fcb8b750cc02f6dc56"}
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.184910 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.325056 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume\") pod \"f29c6b96-904b-4656-a13d-c338c5173352\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") "
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.325330 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mvwp\" (UniqueName: \"kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp\") pod \"f29c6b96-904b-4656-a13d-c338c5173352\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") "
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.325368 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume\") pod \"f29c6b96-904b-4656-a13d-c338c5173352\" (UID: \"f29c6b96-904b-4656-a13d-c338c5173352\") "
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.326430 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume" (OuterVolumeSpecName: "config-volume") pod "f29c6b96-904b-4656-a13d-c338c5173352" (UID: "f29c6b96-904b-4656-a13d-c338c5173352"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.339802 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f29c6b96-904b-4656-a13d-c338c5173352" (UID: "f29c6b96-904b-4656-a13d-c338c5173352"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.345388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp" (OuterVolumeSpecName: "kube-api-access-7mvwp") pod "f29c6b96-904b-4656-a13d-c338c5173352" (UID: "f29c6b96-904b-4656-a13d-c338c5173352"). InnerVolumeSpecName "kube-api-access-7mvwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.428176 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f29c6b96-904b-4656-a13d-c338c5173352-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.428222 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mvwp\" (UniqueName: \"kubernetes.io/projected/f29c6b96-904b-4656-a13d-c338c5173352-kube-api-access-7mvwp\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.428233 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f29c6b96-904b-4656-a13d-c338c5173352-config-volume\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.814016 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f" event={"ID":"f29c6b96-904b-4656-a13d-c338c5173352","Type":"ContainerDied","Data":"b466fa41e23b75e145aec8a8ff6174a2481d1519875c74fcb8b750cc02f6dc56"}
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.814056 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b466fa41e23b75e145aec8a8ff6174a2481d1519875c74fcb8b750cc02f6dc56"
Dec 06 09:45:03 crc kubenswrapper[4954]: I1206 09:45:03.814055 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"
Dec 06 09:45:04 crc kubenswrapper[4954]: I1206 09:45:04.273879 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb"]
Dec 06 09:45:04 crc kubenswrapper[4954]: I1206 09:45:04.283250 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416860-mc6mb"]
Dec 06 09:45:05 crc kubenswrapper[4954]: I1206 09:45:05.458348 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff71e07-e11a-46fe-873b-4503a384170e" path="/var/lib/kubelet/pods/fff71e07-e11a-46fe-873b-4503a384170e/volumes"
Dec 06 09:45:09 crc kubenswrapper[4954]: I1206 09:45:09.444018 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:45:09 crc kubenswrapper[4954]: E1206 09:45:09.445126 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:45:22 crc kubenswrapper[4954]: I1206 09:45:22.444316 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf"
Dec 06 09:45:23 crc kubenswrapper[4954]: I1206 09:45:23.071088 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9"}
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.794330 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"]
Dec 06 09:45:25 crc kubenswrapper[4954]: E1206 09:45:25.797756 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29c6b96-904b-4656-a13d-c338c5173352" containerName="collect-profiles"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.798000 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29c6b96-904b-4656-a13d-c338c5173352" containerName="collect-profiles"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.798465 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29c6b96-904b-4656-a13d-c338c5173352" containerName="collect-profiles"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.809519 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"]
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.809646 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.909397 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.909518 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:25 crc kubenswrapper[4954]: I1206 09:45:25.909810 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtfk6\" (UniqueName: \"kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.012184 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtfk6\" (UniqueName: \"kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.012660 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.012795 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.013198 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.013319 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.037510 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtfk6\" (UniqueName: \"kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6\") pod \"certified-operators-5dqgq\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") " pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.143715 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:26 crc kubenswrapper[4954]: I1206 09:45:26.722119 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"]
Dec 06 09:45:27 crc kubenswrapper[4954]: I1206 09:45:27.124490 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerStarted","Data":"877615df47b6e28c66a1223a748cee19d5669283d49043369d42d1414a759f13"}
Dec 06 09:45:28 crc kubenswrapper[4954]: I1206 09:45:28.137239 4954 generic.go:334] "Generic (PLEG): container finished" podID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerID="b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421" exitCode=0
Dec 06 09:45:28 crc kubenswrapper[4954]: I1206 09:45:28.137534 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerDied","Data":"b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421"}
Dec 06 09:45:28 crc kubenswrapper[4954]: I1206 09:45:28.140214 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 09:45:29 crc kubenswrapper[4954]: I1206 09:45:29.917599 4954 scope.go:117] "RemoveContainer" containerID="333f424f7ebedbbabc5761b83016cf316e7a55a8f0bd716ca04772782f6d365f"
Dec 06 09:45:30 crc kubenswrapper[4954]: I1206 09:45:30.159016 4954 generic.go:334] "Generic (PLEG): container finished" podID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerID="51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20" exitCode=0
Dec 06 09:45:30 crc kubenswrapper[4954]: I1206 09:45:30.159073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerDied","Data":"51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20"}
Dec 06 09:45:31 crc kubenswrapper[4954]: I1206 09:45:31.169613 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerStarted","Data":"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707"}
Dec 06 09:45:31 crc kubenswrapper[4954]: I1206 09:45:31.193995 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5dqgq" podStartSLOduration=3.765775873 podStartE2EDuration="6.193975748s" podCreationTimestamp="2025-12-06 09:45:25 +0000 UTC" firstStartedPulling="2025-12-06 09:45:28.13983161 +0000 UTC m=+10102.953191019" lastFinishedPulling="2025-12-06 09:45:30.568031505 +0000 UTC m=+10105.381390894" observedRunningTime="2025-12-06 09:45:31.18768517 +0000 UTC m=+10106.001044569" watchObservedRunningTime="2025-12-06 09:45:31.193975748 +0000 UTC m=+10106.007335137"
Dec 06 09:45:36 crc kubenswrapper[4954]: I1206 09:45:36.144299 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:36 crc kubenswrapper[4954]: I1206 09:45:36.144961 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:36 crc kubenswrapper[4954]: I1206 09:45:36.194055 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:36 crc kubenswrapper[4954]: I1206 09:45:36.272859 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:36 crc kubenswrapper[4954]: I1206 09:45:36.442637 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"]
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.243923 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5dqgq" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="registry-server" containerID="cri-o://a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707" gracePeriod=2
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.834741 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dqgq"
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.900453 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities\") pod \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") "
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.900533 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtfk6\" (UniqueName: \"kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6\") pod \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") "
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.900802 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content\") pod \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\" (UID: \"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd\") "
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.901770 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities" (OuterVolumeSpecName: "utilities") pod "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" (UID: "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.925333 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6" (OuterVolumeSpecName: "kube-api-access-vtfk6") pod "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" (UID: "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd"). InnerVolumeSpecName "kube-api-access-vtfk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:45:38 crc kubenswrapper[4954]: I1206 09:45:38.975859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" (UID: "97f27249-c1cd-42b8-85d0-27c8ebe4a8dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.003717 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.003753 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtfk6\" (UniqueName: \"kubernetes.io/projected/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-kube-api-access-vtfk6\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.003764 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.254775 4954 generic.go:334] "Generic (PLEG): container finished" podID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerID="a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707" exitCode=0
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.254814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerDied","Data":"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707"}
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.254838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqgq" event={"ID":"97f27249-c1cd-42b8-85d0-27c8ebe4a8dd","Type":"ContainerDied","Data":"877615df47b6e28c66a1223a748cee19d5669283d49043369d42d1414a759f13"}
Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.254849 4954 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-5dqgq" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.254857 4954 scope.go:117] "RemoveContainer" containerID="a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.286186 4954 scope.go:117] "RemoveContainer" containerID="51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.294306 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"] Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.303664 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5dqgq"] Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.454472 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" path="/var/lib/kubelet/pods/97f27249-c1cd-42b8-85d0-27c8ebe4a8dd/volumes" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.489922 4954 scope.go:117] "RemoveContainer" containerID="b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.536061 4954 scope.go:117] "RemoveContainer" containerID="a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707" Dec 06 09:45:39 crc kubenswrapper[4954]: E1206 09:45:39.536517 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707\": container with ID starting with a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707 not found: ID does not exist" containerID="a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.536561 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707"} err="failed to get container status \"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707\": rpc error: code = NotFound desc = could not find container \"a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707\": container with ID starting with a4cf77b3621542fa14e1b155bf69bada7b52f3a246e12c01ea54903372e46707 not found: ID does not exist" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.536598 4954 scope.go:117] "RemoveContainer" containerID="51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20" Dec 06 09:45:39 crc kubenswrapper[4954]: E1206 09:45:39.537854 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20\": container with ID starting with 51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20 not found: ID does not exist" containerID="51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.537898 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20"} err="failed to get container status \"51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20\": rpc error: code = NotFound desc = could not find container 
\"51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20\": container with ID starting with 51cf9f632470cfb448d75abf82be830c79ca2343734de41c2e9af7f7a38a8e20 not found: ID does not exist" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.537920 4954 scope.go:117] "RemoveContainer" containerID="b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421" Dec 06 09:45:39 crc kubenswrapper[4954]: E1206 09:45:39.538250 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421\": container with ID starting with b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421 not found: ID does not exist" containerID="b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421" Dec 06 09:45:39 crc kubenswrapper[4954]: I1206 09:45:39.538296 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421"} err="failed to get container status \"b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421\": rpc error: code = NotFound desc = could not find container \"b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421\": container with ID starting with b762d6722d2f3f8a96f8adc298a907fc6d6c45589d5ea450d4b2b0ac6f30f421 not found: ID does not exist" Dec 06 09:46:12 crc kubenswrapper[4954]: I1206 09:46:12.618781 4954 generic.go:334] "Generic (PLEG): container finished" podID="509d808a-804b-405a-9a0a-545a8d15a90e" containerID="7bfc01d6e69d36ee379ce7e63df7bf01e26c96d146a980ebce93042d57739357" exitCode=0 Dec 06 09:46:12 crc kubenswrapper[4954]: I1206 09:46:12.618884 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" event={"ID":"509d808a-804b-405a-9a0a-545a8d15a90e","Type":"ContainerDied","Data":"7bfc01d6e69d36ee379ce7e63df7bf01e26c96d146a980ebce93042d57739357"} Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.080111 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264604 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g88qf\" (UniqueName: \"kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264687 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264734 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264767 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264801 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.264903 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.265130 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1\") pod \"509d808a-804b-405a-9a0a-545a8d15a90e\" (UID: \"509d808a-804b-405a-9a0a-545a8d15a90e\") " Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.271331 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf" (OuterVolumeSpecName: "kube-api-access-g88qf") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "kube-api-access-g88qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.271989 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.300966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.302135 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.303824 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.309242 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.320754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory" (OuterVolumeSpecName: "inventory") pod "509d808a-804b-405a-9a0a-545a8d15a90e" (UID: "509d808a-804b-405a-9a0a-545a8d15a90e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373626 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g88qf\" (UniqueName: \"kubernetes.io/projected/509d808a-804b-405a-9a0a-545a8d15a90e-kube-api-access-g88qf\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373674 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373689 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373703 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373716 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373730 4954 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.373743 4954 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/509d808a-804b-405a-9a0a-545a8d15a90e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.636985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" event={"ID":"509d808a-804b-405a-9a0a-545a8d15a90e","Type":"ContainerDied","Data":"db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29"} Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.637242 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db9875d5d7b867e5b2769f071f9c07cc409695bd28e1c5b0b952db09c8a5da29" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.637052 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-2k4qx" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.823965 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gg4q9"] Dec 06 09:46:14 crc kubenswrapper[4954]: E1206 09:46:14.824589 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509d808a-804b-405a-9a0a-545a8d15a90e" containerName="telemetry-openstack-openstack-cell1" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.824613 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="509d808a-804b-405a-9a0a-545a8d15a90e" containerName="telemetry-openstack-openstack-cell1" Dec 06 09:46:14 crc kubenswrapper[4954]: E1206 09:46:14.824645 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="registry-server" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.824655 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="registry-server" Dec 06 09:46:14 crc kubenswrapper[4954]: E1206 09:46:14.824668 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="extract-content" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.824675 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="extract-content" Dec 06 09:46:14 crc kubenswrapper[4954]: E1206 09:46:14.824696 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="extract-utilities" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.824704 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="extract-utilities" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.824994 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f27249-c1cd-42b8-85d0-27c8ebe4a8dd" containerName="registry-server" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.825022 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="509d808a-804b-405a-9a0a-545a8d15a90e" containerName="telemetry-openstack-openstack-cell1" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.826063 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.830209 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.830401 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.830546 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.830627 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.831596 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.836709 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gg4q9"] Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.984093 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrqw\" (UniqueName: \"kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.984266 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.984522 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.984753 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:14 crc kubenswrapper[4954]: I1206 09:46:14.984867 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.086738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cxrqw\" (UniqueName: \"kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.086804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.086888 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.086933 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.087617 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.091946 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.092352 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.093914 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.094058 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.102343 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrqw\" (UniqueName: \"kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw\") pod \"neutron-sriov-openstack-openstack-cell1-gg4q9\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.155873 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:46:15 crc kubenswrapper[4954]: I1206 09:46:15.764550 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gg4q9"] Dec 06 09:46:16 crc kubenswrapper[4954]: I1206 09:46:16.654988 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" event={"ID":"3aa52c3d-befd-4077-b47c-b56664b536eb","Type":"ContainerStarted","Data":"ab9c45efcbf6c65c265e22117b8e1efc1fe46b740c95416a871f5fd0561d597f"} Dec 06 09:46:17 crc kubenswrapper[4954]: I1206 09:46:17.665406 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" event={"ID":"3aa52c3d-befd-4077-b47c-b56664b536eb","Type":"ContainerStarted","Data":"df8910f29e746a6ee1992e27471a39c115d61bc9484bfe4bf04083222978aa8d"} Dec 06 09:46:17 crc kubenswrapper[4954]: I1206 09:46:17.688148 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" podStartSLOduration=3.061834117 podStartE2EDuration="3.688127359s" podCreationTimestamp="2025-12-06 09:46:14 +0000 UTC" firstStartedPulling="2025-12-06 09:46:15.767888233 +0000 UTC m=+10150.581247622" lastFinishedPulling="2025-12-06 09:46:16.394181475 +0000 UTC m=+10151.207540864" observedRunningTime="2025-12-06 09:46:17.679715015 +0000 UTC m=+10152.493074404" watchObservedRunningTime="2025-12-06 09:46:17.688127359 +0000 UTC m=+10152.501486748" Dec 06 09:47:05 crc kubenswrapper[4954]: I1206 09:47:05.238697 4954 generic.go:334] "Generic (PLEG): container finished" podID="3aa52c3d-befd-4077-b47c-b56664b536eb" containerID="df8910f29e746a6ee1992e27471a39c115d61bc9484bfe4bf04083222978aa8d" exitCode=0 Dec 06 09:47:05 crc kubenswrapper[4954]: I1206 09:47:05.238790 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" event={"ID":"3aa52c3d-befd-4077-b47c-b56664b536eb","Type":"ContainerDied","Data":"df8910f29e746a6ee1992e27471a39c115d61bc9484bfe4bf04083222978aa8d"} Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.793922 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.929156 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory\") pod \"3aa52c3d-befd-4077-b47c-b56664b536eb\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.929479 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle\") pod \"3aa52c3d-befd-4077-b47c-b56664b536eb\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.929537 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrqw\" (UniqueName: \"kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw\") pod \"3aa52c3d-befd-4077-b47c-b56664b536eb\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.929634 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key\") pod \"3aa52c3d-befd-4077-b47c-b56664b536eb\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " Dec 06 09:47:06 crc kubenswrapper[4954]: I1206 09:47:06.929675 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0\") pod \"3aa52c3d-befd-4077-b47c-b56664b536eb\" (UID: \"3aa52c3d-befd-4077-b47c-b56664b536eb\") " Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.311465 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" event={"ID":"3aa52c3d-befd-4077-b47c-b56664b536eb","Type":"ContainerDied","Data":"ab9c45efcbf6c65c265e22117b8e1efc1fe46b740c95416a871f5fd0561d597f"} Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.311504 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9c45efcbf6c65c265e22117b8e1efc1fe46b740c95416a871f5fd0561d597f" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.311553 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gg4q9" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.385704 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc"] Dec 06 09:47:07 crc kubenswrapper[4954]: E1206 09:47:07.386297 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa52c3d-befd-4077-b47c-b56664b536eb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.386319 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa52c3d-befd-4077-b47c-b56664b536eb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.386661 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa52c3d-befd-4077-b47c-b56664b536eb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.387479 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.394270 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.404786 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc"] Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.470025 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "3aa52c3d-befd-4077-b47c-b56664b536eb" (UID: "3aa52c3d-befd-4077-b47c-b56664b536eb"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.481900 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw" (OuterVolumeSpecName: "kube-api-access-cxrqw") pod "3aa52c3d-befd-4077-b47c-b56664b536eb" (UID: "3aa52c3d-befd-4077-b47c-b56664b536eb"). InnerVolumeSpecName "kube-api-access-cxrqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.500318 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory" (OuterVolumeSpecName: "inventory") pod "3aa52c3d-befd-4077-b47c-b56664b536eb" (UID: "3aa52c3d-befd-4077-b47c-b56664b536eb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508152 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508207 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508363 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508523 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26bs\" (UniqueName: \"kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508628 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508725 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508749 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.508763 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrqw\" (UniqueName: \"kubernetes.io/projected/3aa52c3d-befd-4077-b47c-b56664b536eb-kube-api-access-cxrqw\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.514430 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "3aa52c3d-befd-4077-b47c-b56664b536eb" (UID: "3aa52c3d-befd-4077-b47c-b56664b536eb"). 
InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.526108 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3aa52c3d-befd-4077-b47c-b56664b536eb" (UID: "3aa52c3d-befd-4077-b47c-b56664b536eb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.610877 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.610984 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g26bs\" (UniqueName: \"kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.611024 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.611058 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.611077 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.611178 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.611194 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3aa52c3d-befd-4077-b47c-b56664b536eb-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.614687 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: 
\"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.615156 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.615351 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.616539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.636285 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g26bs\" (UniqueName: \"kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs\") pod \"neutron-dhcp-openstack-openstack-cell1-wfjjc\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:07 crc kubenswrapper[4954]: I1206 09:47:07.720662 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:47:08 crc kubenswrapper[4954]: I1206 09:47:08.286031 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc"] Dec 06 09:47:08 crc kubenswrapper[4954]: I1206 09:47:08.321239 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" event={"ID":"d2897056-ac73-4b75-8fe9-932a99a6bf80","Type":"ContainerStarted","Data":"2c97f74ed17186de1528d8a801d116c361bb69ad6d5e8f6482652407ee458f49"} Dec 06 09:47:09 crc kubenswrapper[4954]: I1206 09:47:09.333629 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" event={"ID":"d2897056-ac73-4b75-8fe9-932a99a6bf80","Type":"ContainerStarted","Data":"06b68d54e5630c95e46eeb74b4271035f003a52e3b1dadd90e47603624587def"} Dec 06 09:47:09 crc kubenswrapper[4954]: I1206 09:47:09.356279 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" podStartSLOduration=1.793735668 podStartE2EDuration="2.356217001s" podCreationTimestamp="2025-12-06 09:47:07 +0000 UTC" firstStartedPulling="2025-12-06 09:47:08.298464217 +0000 UTC m=+10203.111823596" lastFinishedPulling="2025-12-06 09:47:08.86094553 +0000 UTC m=+10203.674304929" observedRunningTime="2025-12-06 09:47:09.34980887 +0000 UTC m=+10204.163168269" watchObservedRunningTime="2025-12-06 09:47:09.356217001 +0000 UTC m=+10204.169576380" Dec 06 09:47:40 crc kubenswrapper[4954]: I1206 09:47:40.100881 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:47:40 crc kubenswrapper[4954]: I1206 09:47:40.101302 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:48:09 crc kubenswrapper[4954]: I1206 09:48:09.962863 4954 generic.go:334] "Generic (PLEG): container finished" podID="d2897056-ac73-4b75-8fe9-932a99a6bf80" containerID="06b68d54e5630c95e46eeb74b4271035f003a52e3b1dadd90e47603624587def" exitCode=0 Dec 06 09:48:09 crc kubenswrapper[4954]: I1206 09:48:09.962933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" event={"ID":"d2897056-ac73-4b75-8fe9-932a99a6bf80","Type":"ContainerDied","Data":"06b68d54e5630c95e46eeb74b4271035f003a52e3b1dadd90e47603624587def"} Dec 06 09:48:10 crc kubenswrapper[4954]: I1206 09:48:10.101649 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:48:10 crc kubenswrapper[4954]: I1206 09:48:10.101713 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.427451 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.559119 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory\") pod \"d2897056-ac73-4b75-8fe9-932a99a6bf80\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.559201 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0\") pod \"d2897056-ac73-4b75-8fe9-932a99a6bf80\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.559454 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g26bs\" (UniqueName: \"kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs\") pod \"d2897056-ac73-4b75-8fe9-932a99a6bf80\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.559501 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key\") pod \"d2897056-ac73-4b75-8fe9-932a99a6bf80\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.559753 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle\") pod \"d2897056-ac73-4b75-8fe9-932a99a6bf80\" (UID: \"d2897056-ac73-4b75-8fe9-932a99a6bf80\") " Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.565950 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "d2897056-ac73-4b75-8fe9-932a99a6bf80" (UID: "d2897056-ac73-4b75-8fe9-932a99a6bf80"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.573888 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs" (OuterVolumeSpecName: "kube-api-access-g26bs") pod "d2897056-ac73-4b75-8fe9-932a99a6bf80" (UID: "d2897056-ac73-4b75-8fe9-932a99a6bf80"). InnerVolumeSpecName "kube-api-access-g26bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.590199 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2897056-ac73-4b75-8fe9-932a99a6bf80" (UID: "d2897056-ac73-4b75-8fe9-932a99a6bf80"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.590739 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "d2897056-ac73-4b75-8fe9-932a99a6bf80" (UID: "d2897056-ac73-4b75-8fe9-932a99a6bf80"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.592765 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory" (OuterVolumeSpecName: "inventory") pod "d2897056-ac73-4b75-8fe9-932a99a6bf80" (UID: "d2897056-ac73-4b75-8fe9-932a99a6bf80"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.663492 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g26bs\" (UniqueName: \"kubernetes.io/projected/d2897056-ac73-4b75-8fe9-932a99a6bf80-kube-api-access-g26bs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.663537 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.663551 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.663628 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.663642 4954 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d2897056-ac73-4b75-8fe9-932a99a6bf80-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.989265 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" event={"ID":"d2897056-ac73-4b75-8fe9-932a99a6bf80","Type":"ContainerDied","Data":"2c97f74ed17186de1528d8a801d116c361bb69ad6d5e8f6482652407ee458f49"} Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.989323 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c97f74ed17186de1528d8a801d116c361bb69ad6d5e8f6482652407ee458f49" Dec 06 09:48:11 crc kubenswrapper[4954]: I1206 09:48:11.989322 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-wfjjc" Dec 06 09:48:30 crc kubenswrapper[4954]: I1206 09:48:30.031901 4954 scope.go:117] "RemoveContainer" containerID="11e388b7d9a3878d863e4e06b32ea6a46016220797814b445745338b711015fc" Dec 06 09:48:30 crc kubenswrapper[4954]: I1206 09:48:30.071371 4954 scope.go:117] "RemoveContainer" containerID="591e630964e1b404a9d53867e0a8861a77331c172a529d8137a98ae2d68f3dcf" Dec 06 09:48:30 crc kubenswrapper[4954]: I1206 09:48:30.690933 4954 scope.go:117] "RemoveContainer" containerID="3cf7199f141cf3a8bfc733a02fca44daf712e2afae25fa149be377bb2a80e67c" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.081473 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.082845 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="309d0b52-0b14-4f55-b15c-acb1a6824a88" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6" gracePeriod=30 Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.602578 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.602936 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="7d57c38b-caac-4a50-b930-85079409a490" containerName="nova-cell1-conductor-conductor" containerID="cri-o://57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c" gracePeriod=30 Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.753488 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.753813 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerName="nova-scheduler-scheduler" containerID="cri-o://b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" gracePeriod=30 Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.767386 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.768676 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-log" containerID="cri-o://4bc6e5b0e964b77c627b986b87b8adee2fc9226f47b5cb20140ddbd607f93902" gracePeriod=30 Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.769080 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-api" containerID="cri-o://d71644552accf5ae017efcff05ca3deea024bd07c42a94f8b77215c7f8d0956b" gracePeriod=30 Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.829674 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.830127 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" containerID="cri-o://a63ecc39c19cdb980123413a222cae58a66829ca8107faaa473e80b57ac52273" gracePeriod=30 Dec 06 09:48:38 crc 
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.830861 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" containerID="cri-o://5ba2b28d1db1e5489ac7175d5b6f06f37a289288cd713269dfbe260180b3d9f2" gracePeriod=30
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.852005 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"]
Dec 06 09:48:38 crc kubenswrapper[4954]: E1206 09:48:38.852437 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2897056-ac73-4b75-8fe9-932a99a6bf80" containerName="neutron-dhcp-openstack-openstack-cell1"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.852455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2897056-ac73-4b75-8fe9-932a99a6bf80" containerName="neutron-dhcp-openstack-openstack-cell1"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.852674 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2897056-ac73-4b75-8fe9-932a99a6bf80" containerName="neutron-dhcp-openstack-openstack-cell1"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.853475 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.859954 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.860127 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.860261 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.860364 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.860518 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghzsl"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.861053 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.861180 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.880715 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"]
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.964739 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"
Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.964810 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.964918 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.964944 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.965053 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.965091 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.965108 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk4g\" (UniqueName: \"kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.965197 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:38 crc kubenswrapper[4954]: I1206 09:48:38.965242 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.066776 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.066856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.066898 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.066922 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.066986 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.067024 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.067045 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk4g\" (UniqueName: \"kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc 
kubenswrapper[4954]: I1206 09:48:39.067083 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.067108 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.068836 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.074615 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.075228 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.077022 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.080935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.086435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.087112 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.089398 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.102625 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkk4g\" (UniqueName: \"kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.181731 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:48:39 crc kubenswrapper[4954]: E1206 09:48:39.193585 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 09:48:39 crc kubenswrapper[4954]: E1206 09:48:39.196462 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 09:48:39 crc kubenswrapper[4954]: E1206 09:48:39.197840 4954 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 09:48:39 crc kubenswrapper[4954]: E1206 09:48:39.199952 4954 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerName="nova-scheduler-scheduler" Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.323911 4954 generic.go:334] "Generic (PLEG): container finished" podID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" 
containerID="4bc6e5b0e964b77c627b986b87b8adee2fc9226f47b5cb20140ddbd607f93902" exitCode=143 Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.324016 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerDied","Data":"4bc6e5b0e964b77c627b986b87b8adee2fc9226f47b5cb20140ddbd607f93902"} Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.327233 4954 generic.go:334] "Generic (PLEG): container finished" podID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerID="a63ecc39c19cdb980123413a222cae58a66829ca8107faaa473e80b57ac52273" exitCode=143 Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.327279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerDied","Data":"a63ecc39c19cdb980123413a222cae58a66829ca8107faaa473e80b57ac52273"} Dec 06 09:48:39 crc kubenswrapper[4954]: I1206 09:48:39.769770 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"] Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.101750 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.101804 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.101846 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.102404 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.102448 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9" gracePeriod=600 Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.189733 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.300088 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle\") pod \"7d57c38b-caac-4a50-b930-85079409a490\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.300438 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgbjj\" (UniqueName: \"kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj\") pod \"7d57c38b-caac-4a50-b930-85079409a490\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.300591 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data\") pod \"7d57c38b-caac-4a50-b930-85079409a490\" (UID: \"7d57c38b-caac-4a50-b930-85079409a490\") " Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.304149 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj" (OuterVolumeSpecName: "kube-api-access-mgbjj") pod "7d57c38b-caac-4a50-b930-85079409a490" (UID: "7d57c38b-caac-4a50-b930-85079409a490"). InnerVolumeSpecName "kube-api-access-mgbjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.328128 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d57c38b-caac-4a50-b930-85079409a490" (UID: "7d57c38b-caac-4a50-b930-85079409a490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.333427 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data" (OuterVolumeSpecName: "config-data") pod "7d57c38b-caac-4a50-b930-85079409a490" (UID: "7d57c38b-caac-4a50-b930-85079409a490"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.348842 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9" exitCode=0 Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.348917 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9"} Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.348965 4954 scope.go:117] "RemoveContainer" containerID="3ccdf154367fd4fd8558ce096a4777fee80906b774e453b909ed6aa9f57472cf" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.350253 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" event={"ID":"4787a099-d3d6-4ae4-9cfa-b50d3d082889","Type":"ContainerStarted","Data":"8156a0ec174bce0d94d18d04e32201b92d777d3df1f7fff61185a20cc88ff3dc"} Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.351544 4954 generic.go:334] "Generic (PLEG): container finished" podID="7d57c38b-caac-4a50-b930-85079409a490" containerID="57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c" exitCode=0 Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.351586 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d57c38b-caac-4a50-b930-85079409a490","Type":"ContainerDied","Data":"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c"} Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.351595 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.351600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7d57c38b-caac-4a50-b930-85079409a490","Type":"ContainerDied","Data":"b30209fdd1ebadeb04a4a09ec4e4f2f8688722046245838a079298b8b88bec13"} Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.396833 4954 scope.go:117] "RemoveContainer" containerID="57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.399557 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.403063 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.403093 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d57c38b-caac-4a50-b930-85079409a490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.403105 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgbjj\" (UniqueName: \"kubernetes.io/projected/7d57c38b-caac-4a50-b930-85079409a490-kube-api-access-mgbjj\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.416324 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.428009 4954 scope.go:117] "RemoveContainer" containerID="57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c" Dec 06 09:48:40 crc kubenswrapper[4954]: E1206 09:48:40.428603 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c\": container with ID starting with 57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c not found: ID does not exist" containerID="57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.428667 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c"} err="failed to get container status \"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c\": rpc error: code = NotFound desc = could not find container \"57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c\": container with ID starting with 57c3c5afb583cb02bd0bdff592d19200582363dc10579239b56c8e9b4aeaff9c not found: ID does not exist" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.429956 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:48:40 crc kubenswrapper[4954]: E1206 09:48:40.430688 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d57c38b-caac-4a50-b930-85079409a490" containerName="nova-cell1-conductor-conductor" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.430714 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d57c38b-caac-4a50-b930-85079409a490" containerName="nova-cell1-conductor-conductor" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.430973 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d57c38b-caac-4a50-b930-85079409a490" containerName="nova-cell1-conductor-conductor"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.432000 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.439340 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.440201 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.508320 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.508900 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.509106 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs54k\" (UniqueName: \"kubernetes.io/projected/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-kube-api-access-qs54k\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.611275 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.611342 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.611415 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs54k\" (UniqueName: \"kubernetes.io/projected/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-kube-api-access-qs54k\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.615300 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0"
Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.615414 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.626967 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs54k\" (UniqueName: \"kubernetes.io/projected/1ff9df60-ff7e-4974-94d3-eb3d76cf6887-kube-api-access-qs54k\") pod \"nova-cell1-conductor-0\" (UID: \"1ff9df60-ff7e-4974-94d3-eb3d76cf6887\") " pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:40 crc kubenswrapper[4954]: I1206 09:48:40.759105 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:41 crc kubenswrapper[4954]: I1206 09:48:41.254842 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 09:48:41 crc kubenswrapper[4954]: I1206 09:48:41.370899 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"} Dec 06 09:48:41 crc kubenswrapper[4954]: I1206 09:48:41.381554 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" event={"ID":"4787a099-d3d6-4ae4-9cfa-b50d3d082889","Type":"ContainerStarted","Data":"9e678cb95550c2b064ab40161a5b7c7126ebc9826631369eb573dabaaa19152c"} Dec 06 09:48:41 crc kubenswrapper[4954]: I1206 09:48:41.416653 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" podStartSLOduration=2.984992975 podStartE2EDuration="3.416624201s" podCreationTimestamp="2025-12-06 09:48:38 +0000 UTC" firstStartedPulling="2025-12-06 09:48:39.786211275 +0000 UTC m=+10294.599570664" lastFinishedPulling="2025-12-06 09:48:40.217842501 +0000 UTC m=+10295.031201890" observedRunningTime="2025-12-06 09:48:41.404807576 +0000 UTC m=+10296.218166965" watchObservedRunningTime="2025-12-06 09:48:41.416624201 +0000 UTC m=+10296.229983600" Dec 06 09:48:41 crc kubenswrapper[4954]: I1206 09:48:41.456047 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d57c38b-caac-4a50-b930-85079409a490" path="/var/lib/kubelet/pods/7d57c38b-caac-4a50-b930-85079409a490/volumes" Dec 06 09:48:41 crc kubenswrapper[4954]: W1206 09:48:41.674887 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ff9df60_ff7e_4974_94d3_eb3d76cf6887.slice/crio-f47f069cc0b1cb244cc29a1038ffd0bfdaa0431e21c269d0165eafd3080a48e4 WatchSource:0}: Error finding container f47f069cc0b1cb244cc29a1038ffd0bfdaa0431e21c269d0165eafd3080a48e4: Status 404 returned error can't find the container with id f47f069cc0b1cb244cc29a1038ffd0bfdaa0431e21c269d0165eafd3080a48e4 Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.069468 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.145714 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle\") pod \"309d0b52-0b14-4f55-b15c-acb1a6824a88\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.145775 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data\") pod \"309d0b52-0b14-4f55-b15c-acb1a6824a88\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.145792 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrpgl\" (UniqueName: \"kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl\") pod \"309d0b52-0b14-4f55-b15c-acb1a6824a88\" (UID: \"309d0b52-0b14-4f55-b15c-acb1a6824a88\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.160768 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl" (OuterVolumeSpecName: "kube-api-access-vrpgl") pod "309d0b52-0b14-4f55-b15c-acb1a6824a88" (UID: "309d0b52-0b14-4f55-b15c-acb1a6824a88"). InnerVolumeSpecName "kube-api-access-vrpgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.177806 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "309d0b52-0b14-4f55-b15c-acb1a6824a88" (UID: "309d0b52-0b14-4f55-b15c-acb1a6824a88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.205737 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data" (OuterVolumeSpecName: "config-data") pod "309d0b52-0b14-4f55-b15c-acb1a6824a88" (UID: "309d0b52-0b14-4f55-b15c-acb1a6824a88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.248336 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.248370 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309d0b52-0b14-4f55-b15c-acb1a6824a88-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.248380 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrpgl\" (UniqueName: \"kubernetes.io/projected/309d0b52-0b14-4f55-b15c-acb1a6824a88-kube-api-access-vrpgl\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.260877 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": read tcp 10.217.0.2:33982->10.217.1.107:8775: read: connection reset by peer" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.260911 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": read tcp 10.217.0.2:33978->10.217.1.107:8775: read: connection reset by peer" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.409069 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1ff9df60-ff7e-4974-94d3-eb3d76cf6887","Type":"ContainerStarted","Data":"6e8b10edba7d262d1aec6a02d3e1291647a913957ae87a97266de14b3129d96a"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.409103 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1ff9df60-ff7e-4974-94d3-eb3d76cf6887","Type":"ContainerStarted","Data":"f47f069cc0b1cb244cc29a1038ffd0bfdaa0431e21c269d0165eafd3080a48e4"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.409131 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.426077 4954 generic.go:334] "Generic (PLEG): container finished" podID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerID="d71644552accf5ae017efcff05ca3deea024bd07c42a94f8b77215c7f8d0956b" exitCode=0 Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.426159 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerDied","Data":"d71644552accf5ae017efcff05ca3deea024bd07c42a94f8b77215c7f8d0956b"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.431814 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.431797361 podStartE2EDuration="2.431797361s" podCreationTimestamp="2025-12-06 09:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:48:42.429938422 +0000 UTC m=+10297.243297811" watchObservedRunningTime="2025-12-06 09:48:42.431797361 +0000 UTC m=+10297.245156740" Dec 06 09:48:42 crc 
kubenswrapper[4954]: I1206 09:48:42.436896 4954 generic.go:334] "Generic (PLEG): container finished" podID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerID="5ba2b28d1db1e5489ac7175d5b6f06f37a289288cd713269dfbe260180b3d9f2" exitCode=0 Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.437100 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerDied","Data":"5ba2b28d1db1e5489ac7175d5b6f06f37a289288cd713269dfbe260180b3d9f2"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.450377 4954 generic.go:334] "Generic (PLEG): container finished" podID="309d0b52-0b14-4f55-b15c-acb1a6824a88" containerID="7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6" exitCode=0 Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.451010 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.451379 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"309d0b52-0b14-4f55-b15c-acb1a6824a88","Type":"ContainerDied","Data":"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.451407 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"309d0b52-0b14-4f55-b15c-acb1a6824a88","Type":"ContainerDied","Data":"c5026b8160d1f070ecaa5582f82e6a84679c1e1c64030fba5e2e7cedf256f2d0"} Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.451423 4954 scope.go:117] "RemoveContainer" containerID="7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.495846 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.519903 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.545471 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.553820 4954 scope.go:117] "RemoveContainer" containerID="7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.557901 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.558091 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.558131 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djgbq\" (UniqueName: \"kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.558171 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.558324 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.558397 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs\") pod \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\" (UID: \"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb\") " Dec 06 09:48:42 crc kubenswrapper[4954]: E1206 09:48:42.559742 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6\": container with ID starting with 7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6 not found: ID does not exist" containerID="7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.559782 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6"} err="failed to get container status \"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6\": rpc error: code = NotFound desc = could not find container 
\"7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6\": container with ID starting with 7879d554328be882d29323f6423d040c51857bfb864f3e9ad740f8ef137010b6 not found: ID does not exist" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.560599 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs" (OuterVolumeSpecName: "logs") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.562668 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:42 crc kubenswrapper[4954]: E1206 09:48:42.563145 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-api" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563163 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-api" Dec 06 09:48:42 crc kubenswrapper[4954]: E1206 09:48:42.563188 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309d0b52-0b14-4f55-b15c-acb1a6824a88" containerName="nova-cell0-conductor-conductor" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563195 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="309d0b52-0b14-4f55-b15c-acb1a6824a88" containerName="nova-cell0-conductor-conductor" Dec 06 09:48:42 crc kubenswrapper[4954]: E1206 09:48:42.563228 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-log" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563236 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-log" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563425 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="309d0b52-0b14-4f55-b15c-acb1a6824a88" containerName="nova-cell0-conductor-conductor" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563450 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-log" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.563461 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" containerName="nova-api-api" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.564191 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.566934 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.578071 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.582414 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq" (OuterVolumeSpecName: "kube-api-access-djgbq") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "kube-api-access-djgbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.606908 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.611754 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data" (OuterVolumeSpecName: "config-data") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.621367 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.624351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" (UID: "52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.660580 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxb2\" (UniqueName: \"kubernetes.io/projected/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-kube-api-access-llxb2\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.660831 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.660921 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661107 4954 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661182 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djgbq\" (UniqueName: \"kubernetes.io/projected/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-kube-api-access-djgbq\") on node \"crc\" 
DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661252 4954 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661318 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661393 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.661455 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.743004 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.763054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxb2\" (UniqueName: \"kubernetes.io/projected/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-kube-api-access-llxb2\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.763126 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.763176 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.768179 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.768225 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.784771 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxb2\" (UniqueName: \"kubernetes.io/projected/745282f6-0c52-4f5b-a2e2-6e30a5a6d763-kube-api-access-llxb2\") pod \"nova-cell0-conductor-0\" (UID: \"745282f6-0c52-4f5b-a2e2-6e30a5a6d763\") " pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 
09:48:42.864775 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs\") pod \"2ac9b546-77cc-444e-9263-ad4f8edfba97\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.865162 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle\") pod \"2ac9b546-77cc-444e-9263-ad4f8edfba97\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.865215 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94whj\" (UniqueName: \"kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj\") pod \"2ac9b546-77cc-444e-9263-ad4f8edfba97\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.865524 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs\") pod \"2ac9b546-77cc-444e-9263-ad4f8edfba97\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.865604 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data\") pod \"2ac9b546-77cc-444e-9263-ad4f8edfba97\" (UID: \"2ac9b546-77cc-444e-9263-ad4f8edfba97\") " Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.866402 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs" (OuterVolumeSpecName: "logs") pod "2ac9b546-77cc-444e-9263-ad4f8edfba97" (UID: "2ac9b546-77cc-444e-9263-ad4f8edfba97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.878133 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj" (OuterVolumeSpecName: "kube-api-access-94whj") pod "2ac9b546-77cc-444e-9263-ad4f8edfba97" (UID: "2ac9b546-77cc-444e-9263-ad4f8edfba97"). InnerVolumeSpecName "kube-api-access-94whj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.892943 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ac9b546-77cc-444e-9263-ad4f8edfba97" (UID: "2ac9b546-77cc-444e-9263-ad4f8edfba97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.893419 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.902522 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data" (OuterVolumeSpecName: "config-data") pod "2ac9b546-77cc-444e-9263-ad4f8edfba97" (UID: "2ac9b546-77cc-444e-9263-ad4f8edfba97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.935139 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2ac9b546-77cc-444e-9263-ad4f8edfba97" (UID: "2ac9b546-77cc-444e-9263-ad4f8edfba97"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.968576 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.968613 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94whj\" (UniqueName: \"kubernetes.io/projected/2ac9b546-77cc-444e-9263-ad4f8edfba97-kube-api-access-94whj\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.968624 4954 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ac9b546-77cc-444e-9263-ad4f8edfba97-logs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.968635 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:42 crc kubenswrapper[4954]: I1206 09:48:42.968644 4954 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ac9b546-77cc-444e-9263-ad4f8edfba97-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.462459 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309d0b52-0b14-4f55-b15c-acb1a6824a88" path="/var/lib/kubelet/pods/309d0b52-0b14-4f55-b15c-acb1a6824a88/volumes" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.469084 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.505126 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.505114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52eacff3-ab36-43a1-aa46-5a3b5e8c06cb","Type":"ContainerDied","Data":"6b23254c3a0a68e9730786a752e9ba490964144c5de92c690c832a19edc70b84"} Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.505554 4954 scope.go:117] "RemoveContainer" containerID="d71644552accf5ae017efcff05ca3deea024bd07c42a94f8b77215c7f8d0956b" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.508259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2ac9b546-77cc-444e-9263-ad4f8edfba97","Type":"ContainerDied","Data":"1ee4bb710f71f144649f70d9d7b190e364ac0288a0815916f136cc9c24b7736b"} Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.508333 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.514168 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"745282f6-0c52-4f5b-a2e2-6e30a5a6d763","Type":"ContainerStarted","Data":"41b3dd1f64b18a64d15c86e0ad4b20473d652620991ed1cdb822d5a3c9226a08"} Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.515376 4954 generic.go:334] "Generic (PLEG): container finished" podID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerID="b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" exitCode=0 Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.515430 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6699d8f6-3aa0-4564-ac22-cfbfade1f563","Type":"ContainerDied","Data":"b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758"} Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.630465 4954 scope.go:117] "RemoveContainer" containerID="4bc6e5b0e964b77c627b986b87b8adee2fc9226f47b5cb20140ddbd607f93902" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.693283 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.703962 4954 scope.go:117] "RemoveContainer" containerID="5ba2b28d1db1e5489ac7175d5b6f06f37a289288cd713269dfbe260180b3d9f2" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.758967 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.788283 4954 scope.go:117] "RemoveContainer" containerID="a63ecc39c19cdb980123413a222cae58a66829ca8107faaa473e80b57ac52273" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.790304 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle\") pod \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.812528 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gzml\" (UniqueName: \"kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml\") pod \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.812771 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data\") pod \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\" (UID: \"6699d8f6-3aa0-4564-ac22-cfbfade1f563\") " Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.828236 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.828893 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml" (OuterVolumeSpecName: "kube-api-access-5gzml") pod "6699d8f6-3aa0-4564-ac22-cfbfade1f563" (UID: "6699d8f6-3aa0-4564-ac22-cfbfade1f563"). InnerVolumeSpecName "kube-api-access-5gzml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.858920 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.882074 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.882819 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6699d8f6-3aa0-4564-ac22-cfbfade1f563" (UID: "6699d8f6-3aa0-4564-ac22-cfbfade1f563"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.887694 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data" (OuterVolumeSpecName: "config-data") pod "6699d8f6-3aa0-4564-ac22-cfbfade1f563" (UID: "6699d8f6-3aa0-4564-ac22-cfbfade1f563"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.894954 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: E1206 09:48:43.895510 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerName="nova-scheduler-scheduler" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895538 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerName="nova-scheduler-scheduler" Dec 06 09:48:43 crc kubenswrapper[4954]: E1206 09:48:43.895635 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895648 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" Dec 06 09:48:43 crc kubenswrapper[4954]: E1206 09:48:43.895683 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895692 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895940 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-log" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895965 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" containerName="nova-metadata-metadata" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.895979 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" containerName="nova-scheduler-scheduler" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.897723 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.900438 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.900611 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.911826 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.926231 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.926257 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6699d8f6-3aa0-4564-ac22-cfbfade1f563-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.926267 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gzml\" (UniqueName: \"kubernetes.io/projected/6699d8f6-3aa0-4564-ac22-cfbfade1f563-kube-api-access-5gzml\") on node \"crc\" DevicePath \"\"" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.930695 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.932954 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.935217 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.935388 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.936482 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 09:48:43 crc kubenswrapper[4954]: I1206 09:48:43.956498 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028131 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028176 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g67f4\" (UniqueName: \"kubernetes.io/projected/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-kube-api-access-g67f4\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028220 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-logs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028375 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-config-data\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028436 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028476 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028617 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-logs\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028704 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-public-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028792 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdf5x\" (UniqueName: \"kubernetes.io/projected/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-kube-api-access-qdf5x\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028828 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-config-data\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.028914 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.130455 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-public-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.130521 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdf5x\" (UniqueName: \"kubernetes.io/projected/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-kube-api-access-qdf5x\") 
pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.130545 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-config-data\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.130599 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131250 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131289 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g67f4\" (UniqueName: \"kubernetes.io/projected/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-kube-api-access-g67f4\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131328 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-logs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131677 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-config-data\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131699 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.131787 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-logs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.132034 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.132323 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-logs\") pod \"nova-metadata-0\" 
(UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.132737 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-logs\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.526796 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"745282f6-0c52-4f5b-a2e2-6e30a5a6d763","Type":"ContainerStarted","Data":"f54cca2095618b6af4e934c666c86128d446729e57f4a4eda360ca7e25b021fc"} Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.527899 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.530191 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6699d8f6-3aa0-4564-ac22-cfbfade1f563","Type":"ContainerDied","Data":"88a9dfcdd062baf9877594a52e9a67e1ae25d9aeb57ff3673f21fa7c6a18407b"} Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.530230 4954 scope.go:117] "RemoveContainer" containerID="b274541346b18644b926e197db6439fba972c34ad7d27db1a9c2c23ab8675758" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.530355 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.556873 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.556853822 podStartE2EDuration="2.556853822s" podCreationTimestamp="2025-12-06 09:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:48:44.548898681 +0000 UTC m=+10299.362258080" watchObservedRunningTime="2025-12-06 09:48:44.556853822 +0000 UTC m=+10299.370213211" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.569193 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.569406 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.569750 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.570122 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-public-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 
09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.570328 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.570540 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-config-data\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.575210 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdf5x\" (UniqueName: \"kubernetes.io/projected/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-kube-api-access-qdf5x\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.577922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45520f31-d94a-4fb2-85a6-f5b9bc5ef338-config-data\") pod \"nova-api-0\" (UID: \"45520f31-d94a-4fb2-85a6-f5b9bc5ef338\") " pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.578954 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g67f4\" (UniqueName: \"kubernetes.io/projected/e24ace15-e3ed-42ec-ba16-dfc982a11a0c-kube-api-access-g67f4\") pod \"nova-metadata-0\" (UID: \"e24ace15-e3ed-42ec-ba16-dfc982a11a0c\") " pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.704943 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.727712 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.739976 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.741712 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.744002 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.766631 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.833673 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.846836 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-config-data\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.847252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mztxc\" (UniqueName: \"kubernetes.io/projected/75659995-67ef-4ae7-8908-f47ff93e36eb-kube-api-access-mztxc\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.847400 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.860030 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.949531 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mztxc\" (UniqueName: \"kubernetes.io/projected/75659995-67ef-4ae7-8908-f47ff93e36eb-kube-api-access-mztxc\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.949728 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.949812 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-config-data\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.957047 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-config-data\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.960038 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75659995-67ef-4ae7-8908-f47ff93e36eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " pod="openstack/nova-scheduler-0" Dec 06 09:48:44 crc kubenswrapper[4954]: I1206 09:48:44.970943 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mztxc\" (UniqueName: \"kubernetes.io/projected/75659995-67ef-4ae7-8908-f47ff93e36eb-kube-api-access-mztxc\") pod \"nova-scheduler-0\" (UID: \"75659995-67ef-4ae7-8908-f47ff93e36eb\") " 
pod="openstack/nova-scheduler-0" Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.064829 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.396397 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.407953 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 09:48:45 crc kubenswrapper[4954]: W1206 09:48:45.408218 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode24ace15_e3ed_42ec_ba16_dfc982a11a0c.slice/crio-e3849a7d5bc15d785fc98632309d6e2f0c1262dfc20aacc3edf7baa0543bf8a0 WatchSource:0}: Error finding container e3849a7d5bc15d785fc98632309d6e2f0c1262dfc20aacc3edf7baa0543bf8a0: Status 404 returned error can't find the container with id e3849a7d5bc15d785fc98632309d6e2f0c1262dfc20aacc3edf7baa0543bf8a0 Dec 06 09:48:45 crc kubenswrapper[4954]: W1206 09:48:45.414746 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45520f31_d94a_4fb2_85a6_f5b9bc5ef338.slice/crio-24e21e67dc9fc4989ea1df1d32cc1071d2abf1ac59d2c091b7ac307c2602e6c9 WatchSource:0}: Error finding container 24e21e67dc9fc4989ea1df1d32cc1071d2abf1ac59d2c091b7ac307c2602e6c9: Status 404 returned error can't find the container with id 24e21e67dc9fc4989ea1df1d32cc1071d2abf1ac59d2c091b7ac307c2602e6c9 Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.457535 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac9b546-77cc-444e-9263-ad4f8edfba97" path="/var/lib/kubelet/pods/2ac9b546-77cc-444e-9263-ad4f8edfba97/volumes" Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.458178 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52eacff3-ab36-43a1-aa46-5a3b5e8c06cb" path="/var/lib/kubelet/pods/52eacff3-ab36-43a1-aa46-5a3b5e8c06cb/volumes" Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.458802 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6699d8f6-3aa0-4564-ac22-cfbfade1f563" path="/var/lib/kubelet/pods/6699d8f6-3aa0-4564-ac22-cfbfade1f563/volumes" Dec 06 09:48:45 crc kubenswrapper[4954]: W1206 09:48:45.555900 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75659995_67ef_4ae7_8908_f47ff93e36eb.slice/crio-9a2c540f9df18d42f486519bc6b3a388adda2e8179f43c0564e1fd0527f91603 WatchSource:0}: Error finding container 9a2c540f9df18d42f486519bc6b3a388adda2e8179f43c0564e1fd0527f91603: Status 404 returned error can't find the container with id 9a2c540f9df18d42f486519bc6b3a388adda2e8179f43c0564e1fd0527f91603 Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.557974 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.566998 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45520f31-d94a-4fb2-85a6-f5b9bc5ef338","Type":"ContainerStarted","Data":"24e21e67dc9fc4989ea1df1d32cc1071d2abf1ac59d2c091b7ac307c2602e6c9"} Dec 06 09:48:45 crc kubenswrapper[4954]: I1206 09:48:45.569796 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e24ace15-e3ed-42ec-ba16-dfc982a11a0c","Type":"ContainerStarted","Data":"e3849a7d5bc15d785fc98632309d6e2f0c1262dfc20aacc3edf7baa0543bf8a0"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.586831 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e24ace15-e3ed-42ec-ba16-dfc982a11a0c","Type":"ContainerStarted","Data":"55ca37d2aa0ab02f1e0204b774cdd2fd05cb93c1fceb28d6081be53f5457e0dc"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.586870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e24ace15-e3ed-42ec-ba16-dfc982a11a0c","Type":"ContainerStarted","Data":"f8ca9e76ed9e511ed05f67b150e9d438c730f402a10df57263140d7cef554c58"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.589730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75659995-67ef-4ae7-8908-f47ff93e36eb","Type":"ContainerStarted","Data":"ae69c43d7d96b9b2f774347523451a94dc349760bb7fe58aed7f629b19057240"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.589776 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75659995-67ef-4ae7-8908-f47ff93e36eb","Type":"ContainerStarted","Data":"9a2c540f9df18d42f486519bc6b3a388adda2e8179f43c0564e1fd0527f91603"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.591962 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45520f31-d94a-4fb2-85a6-f5b9bc5ef338","Type":"ContainerStarted","Data":"bb3d01e11b74394e50c09a1b4f7460a1f572c72f4aaaf87ea7b84ac3a2f72bb7"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.592010 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45520f31-d94a-4fb2-85a6-f5b9bc5ef338","Type":"ContainerStarted","Data":"b8cf993464704fc7efc44920de3be7586801f1b6e0bc4e396fcd5748c6701c7c"} Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.614996 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.61490542 podStartE2EDuration="3.61490542s" podCreationTimestamp="2025-12-06 09:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:48:46.608311955 +0000 UTC m=+10301.421671374" watchObservedRunningTime="2025-12-06 09:48:46.61490542 +0000 UTC m=+10301.428264799" Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.645535 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.645509915 podStartE2EDuration="2.645509915s" podCreationTimestamp="2025-12-06 09:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:48:46.639543436 +0000 UTC m=+10301.452902865" watchObservedRunningTime="2025-12-06 09:48:46.645509915 +0000 UTC m=+10301.458869344" Dec 06 09:48:46 crc kubenswrapper[4954]: I1206 09:48:46.676275 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.676257914 podStartE2EDuration="3.676257914s" podCreationTimestamp="2025-12-06 09:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:48:46.672170425 +0000 UTC m=+10301.485529814" 
watchObservedRunningTime="2025-12-06 09:48:46.676257914 +0000 UTC m=+10301.489617303" Dec 06 09:48:49 crc kubenswrapper[4954]: I1206 09:48:49.834522 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:48:49 crc kubenswrapper[4954]: I1206 09:48:49.835130 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 09:48:50 crc kubenswrapper[4954]: I1206 09:48:50.065658 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 09:48:50 crc kubenswrapper[4954]: I1206 09:48:50.808292 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 09:48:52 crc kubenswrapper[4954]: I1206 09:48:52.924893 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 09:48:54 crc kubenswrapper[4954]: I1206 09:48:54.834901 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:48:54 crc kubenswrapper[4954]: I1206 09:48:54.835226 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 09:48:54 crc kubenswrapper[4954]: I1206 09:48:54.860487 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:48:54 crc kubenswrapper[4954]: I1206 09:48:54.860543 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.065217 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.091728 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.725301 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.850780 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e24ace15-e3ed-42ec-ba16-dfc982a11a0c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.851042 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e24ace15-e3ed-42ec-ba16-dfc982a11a0c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.874949 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45520f31-d94a-4fb2-85a6-f5b9bc5ef338" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 09:48:55 crc kubenswrapper[4954]: I1206 09:48:55.874960 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45520f31-d94a-4fb2-85a6-f5b9bc5ef338" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.209:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.846103 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.846831 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.853976 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.856571 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.871880 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.873003 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.875296 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 09:49:04 crc kubenswrapper[4954]: I1206 09:49:04.886871 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:49:05 crc kubenswrapper[4954]: I1206 09:49:05.793627 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 09:49:05 crc kubenswrapper[4954]: I1206 09:49:05.800675 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 09:49:13 crc kubenswrapper[4954]: I1206 09:49:13.614808 4954 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod52eacff3-ab36-43a1-aa46-5a3b5e8c06cb"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod52eacff3-ab36-43a1-aa46-5a3b5e8c06cb] : Timed out while waiting for systemd to remove kubepods-besteffort-pod52eacff3_ab36_43a1_aa46_5a3b5e8c06cb.slice" Dec 06 09:49:13 crc kubenswrapper[4954]: I1206 09:49:13.621038 4954 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2ac9b546-77cc-444e-9263-ad4f8edfba97"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2ac9b546-77cc-444e-9263-ad4f8edfba97] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2ac9b546_77cc_444e_9263_ad4f8edfba97.slice" Dec 06 09:50:40 crc kubenswrapper[4954]: I1206 09:50:40.101011 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:50:40 crc kubenswrapper[4954]: I1206 09:50:40.101461 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:51:10 crc kubenswrapper[4954]: I1206 09:51:10.101280 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:51:10 crc kubenswrapper[4954]: I1206 09:51:10.102102 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:51:27 crc kubenswrapper[4954]: I1206 09:51:27.345604 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" event={"ID":"4787a099-d3d6-4ae4-9cfa-b50d3d082889","Type":"ContainerDied","Data":"9e678cb95550c2b064ab40161a5b7c7126ebc9826631369eb573dabaaa19152c"} Dec 06 09:51:27 crc kubenswrapper[4954]: I1206 09:51:27.345594 4954 generic.go:334] "Generic (PLEG): container finished" podID="4787a099-d3d6-4ae4-9cfa-b50d3d082889" containerID="9e678cb95550c2b064ab40161a5b7c7126ebc9826631369eb573dabaaa19152c" exitCode=0 Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.815507 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926419 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926513 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926598 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkk4g\" (UniqueName: \"kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926649 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926786 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.926932 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") " Dec 06 09:51:28 crc 
kubenswrapper[4954]: I1206 09:51:28.926981 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") "
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.927008 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") "
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.927075 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0\") pod \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\" (UID: \"4787a099-d3d6-4ae4-9cfa-b50d3d082889\") "
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.939563 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.939629 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g" (OuterVolumeSpecName: "kube-api-access-gkk4g") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "kube-api-access-gkk4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.958491 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.965762 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.970326 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.971789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.972878 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.981364 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:28 crc kubenswrapper[4954]: I1206 09:51:28.983890 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory" (OuterVolumeSpecName: "inventory") pod "4787a099-d3d6-4ae4-9cfa-b50d3d082889" (UID: "4787a099-d3d6-4ae4-9cfa-b50d3d082889"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029712 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029749 4954 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029760 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029770 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029780 4954 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-inventory\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029789 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029797 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkk4g\" (UniqueName: \"kubernetes.io/projected/4787a099-d3d6-4ae4-9cfa-b50d3d082889-kube-api-access-gkk4g\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029810 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.029818 4954 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4787a099-d3d6-4ae4-9cfa-b50d3d082889-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.365593 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx" event={"ID":"4787a099-d3d6-4ae4-9cfa-b50d3d082889","Type":"ContainerDied","Data":"8156a0ec174bce0d94d18d04e32201b92d777d3df1f7fff61185a20cc88ff3dc"}
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.365631 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8156a0ec174bce0d94d18d04e32201b92d777d3df1f7fff61185a20cc88ff3dc"
Dec 06 09:51:29 crc kubenswrapper[4954]: I1206 09:51:29.365684 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.101180 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.101679 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.101715 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.102443 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.102498 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" gracePeriod=600
Dec 06 09:51:40 crc kubenswrapper[4954]: E1206 09:51:40.226384 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.485210 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" exitCode=0
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.485244 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"}
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.485296 4954 scope.go:117] "RemoveContainer" containerID="ecc8d0836703f7e1eb88ddf5d2d3f6cc0d402d1a725f2802a1c6e875ac74d2b9"
Dec 06 09:51:40 crc kubenswrapper[4954]: I1206 09:51:40.486355 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:51:40 crc kubenswrapper[4954]: E1206 09:51:40.486700 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:51:52 crc kubenswrapper[4954]: I1206 09:51:52.444094 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:51:52 crc kubenswrapper[4954]: E1206 09:51:52.444767 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:52:07 crc kubenswrapper[4954]: I1206 09:52:07.443694 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:52:07 crc kubenswrapper[4954]: E1206 09:52:07.444467 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:52:18 crc kubenswrapper[4954]: I1206 09:52:18.445903 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:52:18 crc kubenswrapper[4954]: E1206 09:52:18.447371 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.262822 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:20 crc kubenswrapper[4954]: E1206 09:52:20.263766 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4787a099-d3d6-4ae4-9cfa-b50d3d082889" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.263785 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4787a099-d3d6-4ae4-9cfa-b50d3d082889" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.264134 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4787a099-d3d6-4ae4-9cfa-b50d3d082889" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.266201 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.280076 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.310103 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8zc\" (UniqueName: \"kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.310186 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.310475 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.413036 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.413129 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.413227 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8zc\" (UniqueName: \"kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.413959 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.414292 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.441676 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8zc\" (UniqueName: \"kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc\") pod \"community-operators-crrgq\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") " pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:20 crc kubenswrapper[4954]: I1206 09:52:20.611965 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:21 crc kubenswrapper[4954]: I1206 09:52:21.166181 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:21 crc kubenswrapper[4954]: I1206 09:52:21.926649 4954 generic.go:334] "Generic (PLEG): container finished" podID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerID="aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a" exitCode=0
Dec 06 09:52:21 crc kubenswrapper[4954]: I1206 09:52:21.926746 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerDied","Data":"aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a"}
Dec 06 09:52:21 crc kubenswrapper[4954]: I1206 09:52:21.926945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerStarted","Data":"c460007a6d80c426ff0543fff68ccc92973275a76fb7899f205acece0df7fc2f"}
Dec 06 09:52:21 crc kubenswrapper[4954]: I1206 09:52:21.930370 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 09:52:22 crc kubenswrapper[4954]: I1206 09:52:22.937976 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerStarted","Data":"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"}
Dec 06 09:52:23 crc kubenswrapper[4954]: I1206 09:52:23.955610 4954 generic.go:334] "Generic (PLEG): container finished" podID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerID="a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c" exitCode=0
Dec 06 09:52:23 crc kubenswrapper[4954]: I1206 09:52:23.955658 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerDied","Data":"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"}
Dec 06 09:52:24 crc kubenswrapper[4954]: I1206 09:52:24.968347 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerStarted","Data":"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"}
Dec 06 09:52:24 crc kubenswrapper[4954]: I1206 09:52:24.987602 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-crrgq" podStartSLOduration=2.549915278 podStartE2EDuration="4.987550836s" podCreationTimestamp="2025-12-06 09:52:20 +0000 UTC" firstStartedPulling="2025-12-06 09:52:21.929806781 +0000 UTC m=+10516.743166210" lastFinishedPulling="2025-12-06 09:52:24.367442369 +0000 UTC m=+10519.180801768" observedRunningTime="2025-12-06 09:52:24.985583273 +0000 UTC m=+10519.798942672" watchObservedRunningTime="2025-12-06 09:52:24.987550836 +0000 UTC m=+10519.800910225"
Dec 06 09:52:30 crc kubenswrapper[4954]: I1206 09:52:30.443372 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:52:30 crc kubenswrapper[4954]: E1206 09:52:30.444787 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:52:30 crc kubenswrapper[4954]: I1206 09:52:30.612494 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:30 crc kubenswrapper[4954]: I1206 09:52:30.612578 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:30 crc kubenswrapper[4954]: I1206 09:52:30.676995 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:31 crc kubenswrapper[4954]: I1206 09:52:31.081534 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:31 crc kubenswrapper[4954]: I1206 09:52:31.136552 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.050583 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-crrgq" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="registry-server" containerID="cri-o://9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc" gracePeriod=2
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.564908 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.757190 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content\") pod \"8955e8f3-6da2-4878-aab6-6450c89bcab0\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") "
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.757295 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg8zc\" (UniqueName: \"kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc\") pod \"8955e8f3-6da2-4878-aab6-6450c89bcab0\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") "
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.757361 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities\") pod \"8955e8f3-6da2-4878-aab6-6450c89bcab0\" (UID: \"8955e8f3-6da2-4878-aab6-6450c89bcab0\") "
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.758667 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities" (OuterVolumeSpecName: "utilities") pod "8955e8f3-6da2-4878-aab6-6450c89bcab0" (UID: "8955e8f3-6da2-4878-aab6-6450c89bcab0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.763667 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc" (OuterVolumeSpecName: "kube-api-access-lg8zc") pod "8955e8f3-6da2-4878-aab6-6450c89bcab0" (UID: "8955e8f3-6da2-4878-aab6-6450c89bcab0"). InnerVolumeSpecName "kube-api-access-lg8zc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.834827 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8955e8f3-6da2-4878-aab6-6450c89bcab0" (UID: "8955e8f3-6da2-4878-aab6-6450c89bcab0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.860033 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.860089 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg8zc\" (UniqueName: \"kubernetes.io/projected/8955e8f3-6da2-4878-aab6-6450c89bcab0-kube-api-access-lg8zc\") on node \"crc\" DevicePath \"\""
Dec 06 09:52:33 crc kubenswrapper[4954]: I1206 09:52:33.860101 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8955e8f3-6da2-4878-aab6-6450c89bcab0-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.069518 4954 generic.go:334] "Generic (PLEG): container finished" podID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerID="9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc" exitCode=0
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.069593 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crrgq"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.069622 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerDied","Data":"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"}
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.069735 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crrgq" event={"ID":"8955e8f3-6da2-4878-aab6-6450c89bcab0","Type":"ContainerDied","Data":"c460007a6d80c426ff0543fff68ccc92973275a76fb7899f205acece0df7fc2f"}
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.069824 4954 scope.go:117] "RemoveContainer" containerID="9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.116984 4954 scope.go:117] "RemoveContainer" containerID="a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.130498 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.141174 4954 scope.go:117] "RemoveContainer" containerID="aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.163036 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-crrgq"]
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.194623 4954 scope.go:117] "RemoveContainer" containerID="9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"
Dec 06 09:52:34 crc kubenswrapper[4954]: E1206 09:52:34.195833 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc\": container with ID starting with 9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc not found: ID does not exist" containerID="9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.195884 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc"} err="failed to get container status \"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc\": rpc error: code = NotFound desc = could not find container \"9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc\": container with ID starting with 9452f70a134246259525bbc9449d88aa3f7ad046be45fb02e520db30a51ef7bc not found: ID does not exist"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.195938 4954 scope.go:117] "RemoveContainer" containerID="a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"
Dec 06 09:52:34 crc kubenswrapper[4954]: E1206 09:52:34.197806 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c\": container with ID starting with a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c not found: ID does not exist" containerID="a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.197941 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c"} err="failed to get container status \"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c\": rpc error: code = NotFound desc = could not find container \"a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c\": container with ID starting with a7e8009e549ede614194bf898fbd29bb0361943cc3e5171dc8c5eba336e7803c not found: ID does not exist"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.198044 4954 scope.go:117] "RemoveContainer" containerID="aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a"
Dec 06 09:52:34 crc kubenswrapper[4954]: E1206 09:52:34.198971 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a\": container with ID starting with aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a not found: ID does not exist" containerID="aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a"
Dec 06 09:52:34 crc kubenswrapper[4954]: I1206 09:52:34.199117 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a"} err="failed to get container status \"aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a\": rpc error: code = NotFound desc = could not find container \"aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a\": container with ID starting with aa3e92bda58534b56afb354a9a34651ea5addd32bc8786b587eb8dd2f496561a not found: ID does not exist"
Dec 06 09:52:35 crc kubenswrapper[4954]: I1206 09:52:35.453644 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" path="/var/lib/kubelet/pods/8955e8f3-6da2-4878-aab6-6450c89bcab0/volumes"
Dec 06 09:52:41 crc kubenswrapper[4954]: I1206 09:52:41.444321 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:52:41 crc kubenswrapper[4954]: E1206 09:52:41.445101 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:52:54 crc kubenswrapper[4954]: I1206 09:52:54.444604 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:52:54 crc kubenswrapper[4954]: E1206 09:52:54.445600 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:53:08 crc kubenswrapper[4954]: I1206 09:53:08.444080 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:53:08 crc kubenswrapper[4954]: E1206 09:53:08.444990 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:53:12 crc kubenswrapper[4954]: I1206 09:53:12.717398 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 06 09:53:12 crc kubenswrapper[4954]: I1206 09:53:12.719169 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="c1d0492d-6e62-434f-9ede-02d942146a89" containerName="adoption" containerID="cri-o://a6ed010538ba1029f5bb0790e878476d37f6e16617297d651553ed5342902e62" gracePeriod=30
Dec 06 09:53:19 crc kubenswrapper[4954]: I1206 09:53:19.443196 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:53:19 crc kubenswrapper[4954]: E1206 09:53:19.444038 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.575209 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:23 crc kubenswrapper[4954]: E1206 09:53:23.576342 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="extract-content"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.576359 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="extract-content"
Dec 06 09:53:23 crc kubenswrapper[4954]: E1206 09:53:23.576400 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="extract-utilities"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.576410 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="extract-utilities"
Dec 06 09:53:23 crc kubenswrapper[4954]: E1206 09:53:23.576431 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="registry-server"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.576441 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="registry-server"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.576737 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8955e8f3-6da2-4878-aab6-6450c89bcab0" containerName="registry-server"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.579182 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.588327 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.693357 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.693437 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.693580 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpbct\" (UniqueName: \"kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.795603 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpbct\" (UniqueName: \"kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.795734 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.795772 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.796316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.796652 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.848481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpbct\" (UniqueName: \"kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct\") pod \"redhat-marketplace-l6nrn\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") " pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:23 crc kubenswrapper[4954]: I1206 09:53:23.920767 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:24 crc kubenswrapper[4954]: W1206 09:53:24.421751 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67776cbe_b64a_4342_8ec8_1a1f36cc9e0b.slice/crio-31301a400dd925621e0115c1e957ea803ed8c0a81eccf56c8df6f675e907e624 WatchSource:0}: Error finding container 31301a400dd925621e0115c1e957ea803ed8c0a81eccf56c8df6f675e907e624: Status 404 returned error can't find the container with id 31301a400dd925621e0115c1e957ea803ed8c0a81eccf56c8df6f675e907e624
Dec 06 09:53:24 crc kubenswrapper[4954]: I1206 09:53:24.432307 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:24 crc kubenswrapper[4954]: I1206 09:53:24.645453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerStarted","Data":"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"}
Dec 06 09:53:24 crc kubenswrapper[4954]: I1206 09:53:24.645496 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerStarted","Data":"31301a400dd925621e0115c1e957ea803ed8c0a81eccf56c8df6f675e907e624"}
Dec 06 09:53:25 crc kubenswrapper[4954]: I1206 09:53:25.670222 4954 generic.go:334] "Generic (PLEG): container finished" podID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerID="fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948" exitCode=0
Dec 06 09:53:25 crc kubenswrapper[4954]: I1206 09:53:25.670311 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerDied","Data":"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"}
Dec 06 09:53:26 crc kubenswrapper[4954]: I1206 09:53:26.684730 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerStarted","Data":"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"}
Dec 06 09:53:27 crc kubenswrapper[4954]: I1206 09:53:27.701688 4954 generic.go:334] "Generic (PLEG): container finished" podID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerID="945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb" exitCode=0
Dec 06 09:53:27 crc kubenswrapper[4954]: I1206 09:53:27.701753 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerDied","Data":"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"}
Dec 06 09:53:28 crc kubenswrapper[4954]: I1206 09:53:28.738151 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerStarted","Data":"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"}
Dec 06 09:53:28 crc kubenswrapper[4954]: I1206 09:53:28.759841 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l6nrn" podStartSLOduration=3.334381113 podStartE2EDuration="5.759819116s" podCreationTimestamp="2025-12-06 09:53:23 +0000 UTC" firstStartedPulling="2025-12-06 09:53:25.672211146 +0000 UTC m=+10580.485570535" lastFinishedPulling="2025-12-06 09:53:28.097649149 +0000 UTC m=+10582.911008538" observedRunningTime="2025-12-06 09:53:28.75810463 +0000 UTC m=+10583.571464029" watchObservedRunningTime="2025-12-06 09:53:28.759819116 +0000 UTC m=+10583.573178515"
Dec 06 09:53:33 crc kubenswrapper[4954]: I1206 09:53:33.444413 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:53:33 crc kubenswrapper[4954]: E1206 09:53:33.445644 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:53:33 crc kubenswrapper[4954]: I1206 09:53:33.921875 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:33 crc kubenswrapper[4954]: I1206 09:53:33.921935 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:34 crc kubenswrapper[4954]: I1206 09:53:34.000069 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:34 crc kubenswrapper[4954]: I1206 09:53:34.902207 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:34 crc kubenswrapper[4954]: I1206 09:53:34.973646 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:36 crc kubenswrapper[4954]: I1206 09:53:36.849653 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l6nrn" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="registry-server" containerID="cri-o://19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03" gracePeriod=2
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.373940 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.487827 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content\") pod \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") "
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.488175 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpbct\" (UniqueName: \"kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct\") pod \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") "
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.488298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities\") pod \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\" (UID: \"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b\") "
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.489409 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities" (OuterVolumeSpecName: "utilities") pod "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" (UID: "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.497859 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct" (OuterVolumeSpecName: "kube-api-access-zpbct") pod "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" (UID: "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b"). InnerVolumeSpecName "kube-api-access-zpbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.522757 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" (UID: "67776cbe-b64a-4342-8ec8-1a1f36cc9e0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.591022 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.591056 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpbct\" (UniqueName: \"kubernetes.io/projected/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-kube-api-access-zpbct\") on node \"crc\" DevicePath \"\""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.591066 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.873506 4954 generic.go:334] "Generic (PLEG): container finished" podID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerID="19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03" exitCode=0
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.873600 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerDied","Data":"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"}
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.873637 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6nrn" event={"ID":"67776cbe-b64a-4342-8ec8-1a1f36cc9e0b","Type":"ContainerDied","Data":"31301a400dd925621e0115c1e957ea803ed8c0a81eccf56c8df6f675e907e624"}
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.873652 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6nrn"
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.873706 4954 scope.go:117] "RemoveContainer" containerID="19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.917077 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.917974 4954 scope.go:117] "RemoveContainer" containerID="945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.926922 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6nrn"]
Dec 06 09:53:37 crc kubenswrapper[4954]: I1206 09:53:37.947587 4954 scope.go:117] "RemoveContainer" containerID="fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.004943 4954 scope.go:117] "RemoveContainer" containerID="19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"
Dec 06 09:53:38 crc kubenswrapper[4954]: E1206 09:53:38.005558 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03\": container with ID starting with 19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03 not found: ID does not exist" containerID="19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.005599 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03"} err="failed to get container status \"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03\": rpc error: code = NotFound desc = could not find container \"19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03\": container with ID starting with 19a52a49e1f420057673ac68cbfd9447cecff50a6f7bd78d90a6b3ae96ddac03 not found: ID does not exist"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.005620 4954 scope.go:117] "RemoveContainer" containerID="945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"
Dec 06 09:53:38 crc kubenswrapper[4954]: E1206 09:53:38.006265 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb\": container with ID starting with 945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb not found: ID does not exist" containerID="945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.006281 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb"} err="failed to get container status \"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb\": rpc error: code = NotFound desc = could not find container \"945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb\": container with ID starting with 945b0b7111a89d4097339f30323570ea71bd697c65962723097d5509a07441cb not found: ID does not exist"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.006294 4954 scope.go:117] "RemoveContainer" containerID="fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"
Dec 06 09:53:38 crc kubenswrapper[4954]: E1206 09:53:38.006885 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948\": container with ID starting with fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948 not found: ID does not exist" containerID="fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"
Dec 06 09:53:38 crc kubenswrapper[4954]: I1206 09:53:38.006902 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948"} err="failed to get container status \"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948\": rpc error: code = NotFound desc = could not find container \"fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948\": container with ID starting with fdb3148c2f07dcc6ee8573b352291c45ea313c3c0186b651c6e6f407095fa948 not found: ID does not exist"
Dec 06 09:53:39 crc kubenswrapper[4954]: I1206 09:53:39.461899 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" path="/var/lib/kubelet/pods/67776cbe-b64a-4342-8ec8-1a1f36cc9e0b/volumes"
Dec 06 09:53:42 crc kubenswrapper[4954]: I1206 09:53:42.923942 4954 generic.go:334] "Generic (PLEG): container finished" podID="c1d0492d-6e62-434f-9ede-02d942146a89" containerID="a6ed010538ba1029f5bb0790e878476d37f6e16617297d651553ed5342902e62" exitCode=137
Dec 06 09:53:42 crc kubenswrapper[4954]: I1206 09:53:42.924687 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c1d0492d-6e62-434f-9ede-02d942146a89","Type":"ContainerDied","Data":"a6ed010538ba1029f5bb0790e878476d37f6e16617297d651553ed5342902e62"}
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.268600 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.410081 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs598\" (UniqueName: \"kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598\") pod \"c1d0492d-6e62-434f-9ede-02d942146a89\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") "
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.410997 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") pod \"c1d0492d-6e62-434f-9ede-02d942146a89\" (UID: \"c1d0492d-6e62-434f-9ede-02d942146a89\") "
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.419888 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598" (OuterVolumeSpecName: "kube-api-access-qs598") pod "c1d0492d-6e62-434f-9ede-02d942146a89" (UID: "c1d0492d-6e62-434f-9ede-02d942146a89"). InnerVolumeSpecName "kube-api-access-qs598". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.442697 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274" (OuterVolumeSpecName: "mariadb-data") pod "c1d0492d-6e62-434f-9ede-02d942146a89" (UID: "c1d0492d-6e62-434f-9ede-02d942146a89"). InnerVolumeSpecName "pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.513704 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs598\" (UniqueName: \"kubernetes.io/projected/c1d0492d-6e62-434f-9ede-02d942146a89-kube-api-access-qs598\") on node \"crc\" DevicePath \"\""
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.513786 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") on node \"crc\" "
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.548265 4954 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.548707 4954 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274") on node "crc"
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.615526 4954 reconciler_common.go:293] "Volume detached for volume \"pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8ccc5f0-1276-4b9a-9397-189833c8d274\") on node \"crc\" DevicePath \"\""
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.939870 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c1d0492d-6e62-434f-9ede-02d942146a89","Type":"ContainerDied","Data":"6dfb18bcf7e4abe9c17c5ce1e19844e0a6234d83fc350cee0979a833c876425d"}
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.939985 4954 scope.go:117] "RemoveContainer" containerID="a6ed010538ba1029f5bb0790e878476d37f6e16617297d651553ed5342902e62"
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.940873 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.970095 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 06 09:53:43 crc kubenswrapper[4954]: I1206 09:53:43.981213 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.499666 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"]
Dec 06 09:53:44 crc kubenswrapper[4954]: E1206 09:53:44.500610 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="registry-server"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.500637 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="registry-server"
Dec 06 09:53:44 crc kubenswrapper[4954]: E1206 09:53:44.500665 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="extract-utilities"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.500678 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="extract-utilities"
Dec 06 09:53:44 crc kubenswrapper[4954]: E1206 09:53:44.500716 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="extract-content"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.500729 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="extract-content"
Dec 06 09:53:44 crc kubenswrapper[4954]: E1206 09:53:44.500787 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d0492d-6e62-434f-9ede-02d942146a89" containerName="adoption"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.500798 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d0492d-6e62-434f-9ede-02d942146a89" containerName="adoption"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.501174 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d0492d-6e62-434f-9ede-02d942146a89" containerName="adoption"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.501236 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="67776cbe-b64a-4342-8ec8-1a1f36cc9e0b" containerName="registry-server"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.503870 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.514814 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"]
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.638955 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.639115 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.639148 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v29g9\" (UniqueName: \"kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.650467 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.651158 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="8730273a-edb3-4f15-8e60-f359dfa1e91c" containerName="adoption" containerID="cri-o://564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d" gracePeriod=30
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.740597 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.740709 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.740731 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v29g9\" (UniqueName: \"kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.741253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc"
Dec 06 09:53:44 crc kubenswrapper[4954]:
I1206 09:53:44.741360 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.760600 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v29g9\" (UniqueName: \"kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9\") pod \"redhat-operators-w95vc\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:53:44 crc kubenswrapper[4954]: I1206 09:53:44.847026 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:53:45 crc kubenswrapper[4954]: I1206 09:53:45.325474 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"] Dec 06 09:53:45 crc kubenswrapper[4954]: I1206 09:53:45.463451 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d0492d-6e62-434f-9ede-02d942146a89" path="/var/lib/kubelet/pods/c1d0492d-6e62-434f-9ede-02d942146a89/volumes" Dec 06 09:53:45 crc kubenswrapper[4954]: I1206 09:53:45.971860 4954 generic.go:334] "Generic (PLEG): container finished" podID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerID="fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c" exitCode=0 Dec 06 09:53:45 crc kubenswrapper[4954]: I1206 09:53:45.971906 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerDied","Data":"fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c"} Dec 06 09:53:45 crc kubenswrapper[4954]: I1206 09:53:45.971930 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerStarted","Data":"43d100505e4140c0767c5afa8deb0f666dd523fe9e3721491b1497757c97dc10"} Dec 06 09:53:46 crc kubenswrapper[4954]: I1206 09:53:46.444640 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:53:46 crc kubenswrapper[4954]: E1206 09:53:46.445531 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:53:48 crc kubenswrapper[4954]: I1206 09:53:48.006762 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerStarted","Data":"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1"} Dec 06 09:53:50 crc kubenswrapper[4954]: I1206 09:53:50.037029 4954 generic.go:334] "Generic (PLEG): container finished" podID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerID="21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1" exitCode=0 Dec 06 09:53:50 crc kubenswrapper[4954]: I1206 09:53:50.037074 
4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerDied","Data":"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1"} Dec 06 09:53:51 crc kubenswrapper[4954]: I1206 09:53:51.052118 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerStarted","Data":"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e"} Dec 06 09:53:51 crc kubenswrapper[4954]: I1206 09:53:51.086118 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w95vc" podStartSLOduration=2.5517721719999997 podStartE2EDuration="7.086089836s" podCreationTimestamp="2025-12-06 09:53:44 +0000 UTC" firstStartedPulling="2025-12-06 09:53:45.974919037 +0000 UTC m=+10600.788278426" lastFinishedPulling="2025-12-06 09:53:50.509236691 +0000 UTC m=+10605.322596090" observedRunningTime="2025-12-06 09:53:51.070771768 +0000 UTC m=+10605.884131157" watchObservedRunningTime="2025-12-06 09:53:51.086089836 +0000 UTC m=+10605.899449255" Dec 06 09:53:54 crc kubenswrapper[4954]: I1206 09:53:54.848447 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:53:54 crc kubenswrapper[4954]: I1206 09:53:54.849188 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:53:55 crc kubenswrapper[4954]: I1206 09:53:55.901837 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w95vc" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="registry-server" probeResult="failure" output=< Dec 06 09:53:55 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 09:53:55 crc kubenswrapper[4954]: > Dec 06 09:53:59 crc kubenswrapper[4954]: I1206 09:53:59.444734 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:53:59 crc kubenswrapper[4954]: E1206 09:53:59.445860 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:54:04 crc kubenswrapper[4954]: I1206 09:54:04.908354 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:54:04 crc kubenswrapper[4954]: I1206 09:54:04.961133 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:54:05 crc kubenswrapper[4954]: I1206 09:54:05.149412 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"] Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.207645 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w95vc" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="registry-server" 
containerID="cri-o://49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e" gracePeriod=2 Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.783303 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.962722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities\") pod \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.962815 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content\") pod \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.962845 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v29g9\" (UniqueName: \"kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9\") pod \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\" (UID: \"10cd08a6-0ea2-476e-a68c-b70b32386a8e\") " Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.963667 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities" (OuterVolumeSpecName: "utilities") pod "10cd08a6-0ea2-476e-a68c-b70b32386a8e" (UID: "10cd08a6-0ea2-476e-a68c-b70b32386a8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:54:06 crc kubenswrapper[4954]: I1206 09:54:06.968455 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9" (OuterVolumeSpecName: "kube-api-access-v29g9") pod "10cd08a6-0ea2-476e-a68c-b70b32386a8e" (UID: "10cd08a6-0ea2-476e-a68c-b70b32386a8e"). InnerVolumeSpecName "kube-api-access-v29g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.065873 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.065927 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v29g9\" (UniqueName: \"kubernetes.io/projected/10cd08a6-0ea2-476e-a68c-b70b32386a8e-kube-api-access-v29g9\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.066635 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10cd08a6-0ea2-476e-a68c-b70b32386a8e" (UID: "10cd08a6-0ea2-476e-a68c-b70b32386a8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.168284 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10cd08a6-0ea2-476e-a68c-b70b32386a8e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.227432 4954 generic.go:334] "Generic (PLEG): container finished" podID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerID="49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e" exitCode=0 Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.227499 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerDied","Data":"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e"} Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.227588 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95vc" event={"ID":"10cd08a6-0ea2-476e-a68c-b70b32386a8e","Type":"ContainerDied","Data":"43d100505e4140c0767c5afa8deb0f666dd523fe9e3721491b1497757c97dc10"} Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.227613 4954 scope.go:117] "RemoveContainer" containerID="49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.227519 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95vc" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.255844 4954 scope.go:117] "RemoveContainer" containerID="21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.263822 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"] Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.275920 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w95vc"] Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.283610 4954 scope.go:117] "RemoveContainer" containerID="fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.334347 4954 scope.go:117] "RemoveContainer" containerID="49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e" Dec 06 09:54:07 crc kubenswrapper[4954]: E1206 09:54:07.334717 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e\": container with ID starting with 49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e not found: ID does not exist" containerID="49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.334746 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e"} err="failed to get container status \"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e\": rpc error: code = NotFound desc = could not find container \"49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e\": container with ID starting with 49db9fa8b8e7c02ec7bea065e8a885e41687f1c1e2d71ee6000f1d455d724c7e not found: ID does not exist" Dec 06 09:54:07 crc 
kubenswrapper[4954]: I1206 09:54:07.334774 4954 scope.go:117] "RemoveContainer" containerID="21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1" Dec 06 09:54:07 crc kubenswrapper[4954]: E1206 09:54:07.335300 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1\": container with ID starting with 21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1 not found: ID does not exist" containerID="21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.335326 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1"} err="failed to get container status \"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1\": rpc error: code = NotFound desc = could not find container \"21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1\": container with ID starting with 21ddccc80018363da81df1c6d1d6b0bd72bf01591008a29526b55387bea911e1 not found: ID does not exist" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.335342 4954 scope.go:117] "RemoveContainer" containerID="fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c" Dec 06 09:54:07 crc kubenswrapper[4954]: E1206 09:54:07.335743 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c\": container with ID starting with fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c not found: ID does not exist" containerID="fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.335807 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c"} err="failed to get container status \"fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c\": rpc error: code = NotFound desc = could not find container \"fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c\": container with ID starting with fe33f7a35f869f6ca06cc0b8ff97f3d7652a85a4ac009befe2eaaf94fdd2d13c not found: ID does not exist" Dec 06 09:54:07 crc kubenswrapper[4954]: I1206 09:54:07.457244 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" path="/var/lib/kubelet/pods/10cd08a6-0ea2-476e-a68c-b70b32386a8e/volumes" Dec 06 09:54:11 crc kubenswrapper[4954]: I1206 09:54:11.444026 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:54:11 crc kubenswrapper[4954]: E1206 09:54:11.444814 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.216850 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.318127 4954 generic.go:334] "Generic (PLEG): container finished" podID="8730273a-edb3-4f15-8e60-f359dfa1e91c" containerID="564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d" exitCode=137 Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.318169 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"8730273a-edb3-4f15-8e60-f359dfa1e91c","Type":"ContainerDied","Data":"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d"} Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.318182 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.318193 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"8730273a-edb3-4f15-8e60-f359dfa1e91c","Type":"ContainerDied","Data":"48a448171a772a7c8397b024516cfeb17c5df8bab2c7b59325e0c130492d6f36"} Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.318214 4954 scope.go:117] "RemoveContainer" containerID="564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.339168 4954 scope.go:117] "RemoveContainer" containerID="564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d" Dec 06 09:54:15 crc kubenswrapper[4954]: E1206 09:54:15.339691 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d\": container with ID starting with 564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d not found: ID does not exist" containerID="564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.339725 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d"} err="failed to get container status \"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d\": rpc error: code = NotFound desc = could not find container \"564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d\": container with ID starting with 564581786a415133bc0c02db3ea0b38fcae5ad9875b91e81e66912dd706fda8d not found: ID does not exist" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.363833 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert\") pod \"8730273a-edb3-4f15-8e60-f359dfa1e91c\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.364087 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7664z\" (UniqueName: \"kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z\") pod \"8730273a-edb3-4f15-8e60-f359dfa1e91c\" (UID: \"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.364731 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") pod \"8730273a-edb3-4f15-8e60-f359dfa1e91c\" (UID: 
\"8730273a-edb3-4f15-8e60-f359dfa1e91c\") " Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.370826 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z" (OuterVolumeSpecName: "kube-api-access-7664z") pod "8730273a-edb3-4f15-8e60-f359dfa1e91c" (UID: "8730273a-edb3-4f15-8e60-f359dfa1e91c"). InnerVolumeSpecName "kube-api-access-7664z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.371869 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "8730273a-edb3-4f15-8e60-f359dfa1e91c" (UID: "8730273a-edb3-4f15-8e60-f359dfa1e91c"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.388787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695" (OuterVolumeSpecName: "ovn-data") pod "8730273a-edb3-4f15-8e60-f359dfa1e91c" (UID: "8730273a-edb3-4f15-8e60-f359dfa1e91c"). InnerVolumeSpecName "pvc-ad36409d-98bf-4466-88d2-f82ea3baa695". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.467652 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7664z\" (UniqueName: \"kubernetes.io/projected/8730273a-edb3-4f15-8e60-f359dfa1e91c-kube-api-access-7664z\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.467750 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") on node \"crc\" " Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.467764 4954 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/8730273a-edb3-4f15-8e60-f359dfa1e91c-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.496456 4954 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.496643 4954 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ad36409d-98bf-4466-88d2-f82ea3baa695" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695") on node "crc"
Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.570456 4954 reconciler_common.go:293] "Volume detached for volume \"pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad36409d-98bf-4466-88d2-f82ea3baa695\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.650006 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Dec 06 09:54:15 crc kubenswrapper[4954]: I1206 09:54:15.660352 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"]
Dec 06 09:54:17 crc kubenswrapper[4954]: I1206 09:54:17.455347 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8730273a-edb3-4f15-8e60-f359dfa1e91c" path="/var/lib/kubelet/pods/8730273a-edb3-4f15-8e60-f359dfa1e91c/volumes"
Dec 06 09:54:22 crc kubenswrapper[4954]: I1206 09:54:22.443531 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:54:22 crc kubenswrapper[4954]: E1206 09:54:22.444364 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.196519 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-7cmrk"]
Dec 06 09:54:27 crc kubenswrapper[4954]: E1206 09:54:27.197417 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8730273a-edb3-4f15-8e60-f359dfa1e91c" containerName="adoption"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197430 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730273a-edb3-4f15-8e60-f359dfa1e91c" containerName="adoption"
Dec 06 09:54:27 crc kubenswrapper[4954]: E1206 09:54:27.197449 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="extract-content"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="extract-content"
Dec 06 09:54:27 crc kubenswrapper[4954]: E1206 09:54:27.197484 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="extract-utilities"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197491 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="extract-utilities"
Dec 06 09:54:27 crc kubenswrapper[4954]: E1206 09:54:27.197503 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="registry-server"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197508 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="registry-server"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197737 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cd08a6-0ea2-476e-a68c-b70b32386a8e" containerName="registry-server"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.197760 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8730273a-edb3-4f15-8e60-f359dfa1e91c" containerName="adoption"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.198485 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.202599 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.202758 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.209380 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-7cmrk"]
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236061 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236129 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236195 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv945\" (UniqueName: \"kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236226 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236260 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236282 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.236670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338081 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338455 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338511 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338624 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv945\" (UniqueName: \"kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338682 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338738 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.338780 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.339914 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.340493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.341760 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.345396 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.348590 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.353834 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.360482 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv945\" (UniqueName: \"kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945\") pod \"swift-ring-rebalance-debug-7cmrk\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") " pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:27 crc kubenswrapper[4954]: I1206 09:54:27.527718 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:28 crc kubenswrapper[4954]: I1206 09:54:28.028469 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-7cmrk"]
Dec 06 09:54:28 crc kubenswrapper[4954]: W1206 09:54:28.038762 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf271dbc4_ed8e_469a_8672_e05126e2cbe6.slice/crio-2aa6cddb6351e8bfb9ed0d630a85f353c35f8345b9169a653b138cfdd73b5c31 WatchSource:0}: Error finding container 2aa6cddb6351e8bfb9ed0d630a85f353c35f8345b9169a653b138cfdd73b5c31: Status 404 returned error can't find the container with id 2aa6cddb6351e8bfb9ed0d630a85f353c35f8345b9169a653b138cfdd73b5c31
Dec 06 09:54:28 crc kubenswrapper[4954]: I1206 09:54:28.475865 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7cmrk" event={"ID":"f271dbc4-ed8e-469a-8672-e05126e2cbe6","Type":"ContainerStarted","Data":"795a769796d0f9fd83487451056c60f256b31c9bea84538f79d191d2871d2fe0"}
Dec 06 09:54:28 crc kubenswrapper[4954]: I1206 09:54:28.476304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7cmrk" event={"ID":"f271dbc4-ed8e-469a-8672-e05126e2cbe6","Type":"ContainerStarted","Data":"2aa6cddb6351e8bfb9ed0d630a85f353c35f8345b9169a653b138cfdd73b5c31"}
Dec 06 09:54:28 crc kubenswrapper[4954]: I1206 09:54:28.502215 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-7cmrk" podStartSLOduration=1.50218969 podStartE2EDuration="1.50218969s" podCreationTimestamp="2025-12-06 09:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:54:28.495629935 +0000 UTC m=+10643.308989354" watchObservedRunningTime="2025-12-06 09:54:28.50218969 +0000 UTC m=+10643.315549099"
Dec 06 09:54:29 crc kubenswrapper[4954]: I1206 09:54:29.487029 4954 generic.go:334] "Generic (PLEG): container finished" podID="f271dbc4-ed8e-469a-8672-e05126e2cbe6" containerID="795a769796d0f9fd83487451056c60f256b31c9bea84538f79d191d2871d2fe0" exitCode=0
Dec 06 09:54:29 crc kubenswrapper[4954]: I1206 09:54:29.487072 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7cmrk" event={"ID":"f271dbc4-ed8e-469a-8672-e05126e2cbe6","Type":"ContainerDied","Data":"795a769796d0f9fd83487451056c60f256b31c9bea84538f79d191d2871d2fe0"}
Dec 06 09:54:30 crc kubenswrapper[4954]: I1206 09:54:30.870481 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:30 crc kubenswrapper[4954]: I1206 09:54:30.927829 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-7cmrk"]
Dec 06 09:54:30 crc kubenswrapper[4954]: I1206 09:54:30.948967 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-7cmrk"]
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010100 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv945\" (UniqueName: \"kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010171 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010234 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010408 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010447 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010518 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.010549 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices\") pod \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\" (UID: \"f271dbc4-ed8e-469a-8672-e05126e2cbe6\") "
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.012347 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.013095 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.016491 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945" (OuterVolumeSpecName: "kube-api-access-vv945") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "kube-api-access-vv945". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.043985 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts" (OuterVolumeSpecName: "scripts") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.046023 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.055465 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.070530 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f271dbc4-ed8e-469a-8672-e05126e2cbe6" (UID: "f271dbc4-ed8e-469a-8672-e05126e2cbe6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113364 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-scripts\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113404 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-dispersionconf\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113418 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-swiftconf\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113428 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f271dbc4-ed8e-469a-8672-e05126e2cbe6-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113438 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f271dbc4-ed8e-469a-8672-e05126e2cbe6-ring-data-devices\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113451 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv945\" (UniqueName: \"kubernetes.io/projected/f271dbc4-ed8e-469a-8672-e05126e2cbe6-kube-api-access-vv945\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.113463 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f271dbc4-ed8e-469a-8672-e05126e2cbe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.456426 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f271dbc4-ed8e-469a-8672-e05126e2cbe6" path="/var/lib/kubelet/pods/f271dbc4-ed8e-469a-8672-e05126e2cbe6/volumes"
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.514135 4954 scope.go:117] "RemoveContainer" containerID="795a769796d0f9fd83487451056c60f256b31c9bea84538f79d191d2871d2fe0"
Dec 06 09:54:31 crc kubenswrapper[4954]: I1206 09:54:31.514278 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7cmrk"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.178180 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-h2jmm"]
Dec 06 09:54:32 crc kubenswrapper[4954]: E1206 09:54:32.179290 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f271dbc4-ed8e-469a-8672-e05126e2cbe6" containerName="swift-ring-rebalance"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.179315 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f271dbc4-ed8e-469a-8672-e05126e2cbe6" containerName="swift-ring-rebalance"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.179733 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f271dbc4-ed8e-469a-8672-e05126e2cbe6" containerName="swift-ring-rebalance"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.180999 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.186763 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.187110 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.196167 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-h2jmm"]
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343034 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343117 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkzd\" (UniqueName: \"kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343341 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343395 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.343548 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.444961 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445072 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445117 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445154 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445189 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445217 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkzd\" (UniqueName: \"kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.445237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.446070 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.446335 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.446336 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.449722 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.450084 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.453066 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.467836 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkzd\" (UniqueName: \"kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd\") pod \"swift-ring-rebalance-debug-h2jmm\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " pod="openstack/swift-ring-rebalance-debug-h2jmm"
Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.504901 4954 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-h2jmm" Dec 06 09:54:32 crc kubenswrapper[4954]: I1206 09:54:32.981606 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-h2jmm"] Dec 06 09:54:32 crc kubenswrapper[4954]: W1206 09:54:32.989217 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85af58e_55ba_48c3_982e_49a0e4a1c3de.slice/crio-bcb52dfd35b61c6ec6ea9bb96ae86fae6f6d657f7b0a211d813bbff007c68ca9 WatchSource:0}: Error finding container bcb52dfd35b61c6ec6ea9bb96ae86fae6f6d657f7b0a211d813bbff007c68ca9: Status 404 returned error can't find the container with id bcb52dfd35b61c6ec6ea9bb96ae86fae6f6d657f7b0a211d813bbff007c68ca9 Dec 06 09:54:33 crc kubenswrapper[4954]: I1206 09:54:33.447907 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:54:33 crc kubenswrapper[4954]: E1206 09:54:33.448396 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:54:33 crc kubenswrapper[4954]: I1206 09:54:33.554437 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-h2jmm" event={"ID":"e85af58e-55ba-48c3-982e-49a0e4a1c3de","Type":"ContainerStarted","Data":"13ad78a44130d09cfd95ce82324edb40edcb03e12235ed82a2a99244bd48a869"} Dec 06 09:54:33 crc kubenswrapper[4954]: I1206 09:54:33.554495 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-h2jmm" event={"ID":"e85af58e-55ba-48c3-982e-49a0e4a1c3de","Type":"ContainerStarted","Data":"bcb52dfd35b61c6ec6ea9bb96ae86fae6f6d657f7b0a211d813bbff007c68ca9"} Dec 06 09:54:33 crc kubenswrapper[4954]: I1206 09:54:33.587965 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-h2jmm" podStartSLOduration=1.587940613 podStartE2EDuration="1.587940613s" podCreationTimestamp="2025-12-06 09:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:54:33.576641672 +0000 UTC m=+10648.390001131" watchObservedRunningTime="2025-12-06 09:54:33.587940613 +0000 UTC m=+10648.401300022" Dec 06 09:54:48 crc kubenswrapper[4954]: I1206 09:54:48.443190 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:54:48 crc kubenswrapper[4954]: E1206 09:54:48.445712 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:55:00 crc kubenswrapper[4954]: I1206 09:55:00.862909 4954 generic.go:334] "Generic (PLEG): container finished" podID="e85af58e-55ba-48c3-982e-49a0e4a1c3de" 
containerID="13ad78a44130d09cfd95ce82324edb40edcb03e12235ed82a2a99244bd48a869" exitCode=0 Dec 06 09:55:00 crc kubenswrapper[4954]: I1206 09:55:00.863123 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-h2jmm" event={"ID":"e85af58e-55ba-48c3-982e-49a0e4a1c3de","Type":"ContainerDied","Data":"13ad78a44130d09cfd95ce82324edb40edcb03e12235ed82a2a99244bd48a869"} Dec 06 09:55:01 crc kubenswrapper[4954]: I1206 09:55:01.444141 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:55:01 crc kubenswrapper[4954]: E1206 09:55:01.444458 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.312983 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-h2jmm" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366301 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366452 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366628 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366658 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366785 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhkzd\" (UniqueName: \"kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.366866 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf\") pod \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\" (UID: \"e85af58e-55ba-48c3-982e-49a0e4a1c3de\") " Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.368277 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.368868 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.377008 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-h2jmm"] Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.378850 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd" (OuterVolumeSpecName: "kube-api-access-vhkzd") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "kube-api-access-vhkzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.383064 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-h2jmm"] Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.400322 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.416026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.421684 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts" (OuterVolumeSpecName: "scripts") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.431904 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e85af58e-55ba-48c3-982e-49a0e4a1c3de" (UID: "e85af58e-55ba-48c3-982e-49a0e4a1c3de"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469313 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469362 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469383 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469401 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85af58e-55ba-48c3-982e-49a0e4a1c3de-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469416 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85af58e-55ba-48c3-982e-49a0e4a1c3de-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469432 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85af58e-55ba-48c3-982e-49a0e4a1c3de-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.469449 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhkzd\" (UniqueName: \"kubernetes.io/projected/e85af58e-55ba-48c3-982e-49a0e4a1c3de-kube-api-access-vhkzd\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.888707 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb52dfd35b61c6ec6ea9bb96ae86fae6f6d657f7b0a211d813bbff007c68ca9" Dec 06 09:55:02 crc kubenswrapper[4954]: I1206 09:55:02.888792 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-h2jmm" Dec 06 09:55:03 crc kubenswrapper[4954]: I1206 09:55:03.457107 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85af58e-55ba-48c3-982e-49a0e4a1c3de" path="/var/lib/kubelet/pods/e85af58e-55ba-48c3-982e-49a0e4a1c3de/volumes" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.538526 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 06 09:55:04 crc kubenswrapper[4954]: E1206 09:55:04.539600 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85af58e-55ba-48c3-982e-49a0e4a1c3de" containerName="swift-ring-rebalance" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.539622 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85af58e-55ba-48c3-982e-49a0e4a1c3de" containerName="swift-ring-rebalance" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.540053 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85af58e-55ba-48c3-982e-49a0e4a1c3de" containerName="swift-ring-rebalance" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.557508 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.565884 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.614751 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dk8\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-kube-api-access-86dk8\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.615123 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-etc-swift\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.615283 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-cache\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.615455 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-lock\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.615573 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6720da3-504f-4839-a92b-f82de71e01cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6720da3-504f-4839-a92b-f82de71e01cc\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.618686 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-2"] Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.643310 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.650468 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-1"] Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.658175 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.663052 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.680513 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-2"] Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.701819 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-1"] Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.716696 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.716777 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86dk8\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-kube-api-access-86dk8\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.716815 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz4r4\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-kube-api-access-lz4r4\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.716852 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-etc-swift\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717000 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-cache\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717132 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-etc-swift\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717163 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-lock\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-lock\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717253 4954 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-cache\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.717279 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6720da3-504f-4839-a92b-f82de71e01cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6720da3-504f-4839-a92b-f82de71e01cc\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.718252 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-cache\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.718380 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa5412d3-ec01-4e3f-9525-89669e6a87fc-lock\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.720078 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.720107 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6720da3-504f-4839-a92b-f82de71e01cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6720da3-504f-4839-a92b-f82de71e01cc\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c95d03af544385da0019f7451c4a4de8a1e26f0adeefd74a0950810fdc8d4bfc/globalmount\"" pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.722498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-etc-swift\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.736462 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dk8\" (UniqueName: \"kubernetes.io/projected/fa5412d3-ec01-4e3f-9525-89669e6a87fc-kube-api-access-86dk8\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.756801 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6720da3-504f-4839-a92b-f82de71e01cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6720da3-504f-4839-a92b-f82de71e01cc\") pod \"swift-storage-0\" (UID: \"fa5412d3-ec01-4e3f-9525-89669e6a87fc\") " pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819295 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-etc-swift\") pod \"swift-storage-2\" (UID: 
\"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819353 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-lock\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819384 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819421 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-cache\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819450 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-etc-swift\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819472 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-lock\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819508 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz4r4\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-kube-api-access-lz4r4\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-cache\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.819654 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrw4l\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-kube-api-access-hrw4l\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.824728 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-lock\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.824973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0054b4c8-5d04-47a9-8794-992ac486c936-cache\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.826808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-etc-swift\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.828510 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.828559 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f1bdd633a1bac7adadefa34410f3eb534207c22a0354871fd39437d2d22eee13/globalmount\"" pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.844896 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz4r4\" (UniqueName: \"kubernetes.io/projected/0054b4c8-5d04-47a9-8794-992ac486c936-kube-api-access-lz4r4\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.866011 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3964477-df96-45e2-99d9-3b1d14f0eff3\") pod \"swift-storage-2\" (UID: \"0054b4c8-5d04-47a9-8794-992ac486c936\") " pod="openstack/swift-storage-2" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.889812 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.921069 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-cache\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.921408 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrw4l\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-kube-api-access-hrw4l\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.921581 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.927942 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-etc-swift\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.928185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-lock\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.928819 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-lock\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1" Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.929479 4954 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.929597 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f145cbeb59a014edf6c7d5ec89ac3a7f4d649c04bcaeae552c36b69a3ada4255/globalmount\"" pod="openstack/swift-storage-1"
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.922630 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb7ca5c7-3922-457c-9709-b024e8587ced-cache\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1"
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.957619 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrw4l\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-kube-api-access-hrw4l\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1"
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.957738 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fb7ca5c7-3922-457c-9709-b024e8587ced-etc-swift\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1"
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.969162 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-2"
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.976485 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-dw9vz"]
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.988555 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-dw9vz"]
Dec 06 09:55:04 crc kubenswrapper[4954]: I1206 09:55:04.999097 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tfp9f"]
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.000650 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.002451 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.002485 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.012672 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95541746-5cc2-42c3-9c8e-bd173d58092f\") pod \"swift-storage-1\" (UID: \"fb7ca5c7-3922-457c-9709-b024e8587ced\") " pod="openstack/swift-storage-1"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.019135 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tfp9f"]
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032265 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032316 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032355 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032405 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032435 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.032625 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm6s\" (UniqueName: \"kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.135863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136194 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136225 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm6s\" (UniqueName: \"kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136273 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136299 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136330 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.136368 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.137283 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.139614 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.139637 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.139690 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.139736 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.140705 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.152000 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm6s\" (UniqueName: \"kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s\") pod \"swift-ring-rebalance-tfp9f\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.282705 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-1"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.332939 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tfp9f"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.476850 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a87fde-b771-4bc7-946a-a3e5a26f8992" path="/var/lib/kubelet/pods/b7a87fde-b771-4bc7-946a-a3e5a26f8992/volumes"
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.643844 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-2"]
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.889285 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tfp9f"]
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.940929 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tfp9f" event={"ID":"72a00ae9-92aa-42b4-8e20-dd0bbb65c689","Type":"ContainerStarted","Data":"462f9c336381ec26da9831b0734bf8daf965a920917bc922e2f8184a700d2173"}
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.944272 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"aff58fc8afcbacc403f5b228fb6b2397ac1e0ec79f245e57b466335dfc8aeba0"}
Dec 06 09:55:05 crc kubenswrapper[4954]: I1206 09:55:05.957462 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-1"]
Dec 06 09:55:06 crc kubenswrapper[4954]: I1206 09:55:06.829824 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 06 09:55:06 crc kubenswrapper[4954]: I1206 09:55:06.964747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"814413ec6b37af1f3f982d6fc6ce70e379583e2c6572b1e9b04a4ec0719a2c21"}
Dec 06 09:55:06 crc kubenswrapper[4954]: I1206 09:55:06.965037 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"a586e823c73e75a4ef5bba730f118682d87249f079cb65835923e771e89e2ba6"}
Dec 06 09:55:06 crc kubenswrapper[4954]: I1206 09:55:06.965052 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"3e798a016a10da8452eebb423ce5a81aa264fe5e4e6accefb81ad78605d258fc"}
Dec 06 09:55:06 crc kubenswrapper[4954]: I1206 09:55:06.989900 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tfp9f" event={"ID":"72a00ae9-92aa-42b4-8e20-dd0bbb65c689","Type":"ContainerStarted","Data":"1aa8d57065efdbb50603aaab54411ef0afe8971a8164a74f88a564001c790382"}
Dec 06 09:55:07 crc kubenswrapper[4954]: I1206 09:55:07.007798 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"7c4004b950c8cc6ad9160d96d26e0086ba937308d9e1e0a355160f8691ee6e62"}
Dec 06 09:55:07 crc kubenswrapper[4954]: I1206 09:55:07.016798 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tfp9f" podStartSLOduration=3.016778283 podStartE2EDuration="3.016778283s" podCreationTimestamp="2025-12-06 09:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:55:07.010020563 +0000 UTC m=+10681.823379952" watchObservedRunningTime="2025-12-06 09:55:07.016778283 +0000 UTC m=+10681.830137672"
Dec 06 09:55:07 crc kubenswrapper[4954]: I1206 09:55:07.043531 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"098368db2a62a8f88541d280eb11a485b8b37d23e4cb2764acdc3edd8f425a90"}
Dec 06 09:55:07 crc kubenswrapper[4954]: I1206 09:55:07.043603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"325b38ba55ccdd5b2e60503f49e686ca2089064897a694953989f8916277facb"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.057295 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"9c1352cc39c87ebf2a6cdae45b2a7c03dc3e921fd219b9938436ad2ea67ddbd8"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.058838 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"0533833b9adab85a129fde4829c7e70e8efab878456d3d7e970abf3f6dc183be"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.058950 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"6309b114f22f263a6b7008e7e7c5c97bf1cc64fa76eefd77233ea739b899bf87"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.059066 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"cda364fd6891a25233756fd453d8c6958f3f7c8ed49ea8c0430cf2bc5ad70ef7"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.062845 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"3ab2de5a7109fcfc9396cabda4383c680dc6852bc7af63dfcf311c6cffce24d8"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.063022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"8880750dc47e64645bb184742a47f818bfe07852f7ae9a3aa609e6f8b7d8976e"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.067242 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"b9c12ebd314025b70485613d4c91ab951a264845803aad059def7e6c5f807c5e"}
Dec 06 09:55:08 crc kubenswrapper[4954]: I1206 09:55:08.067830 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"0d28958fba333ed75fcfa53450ff96e7445978296298a736030d010abc631285"}
Dec 06 09:55:09 crc kubenswrapper[4954]: I1206 09:55:09.085020 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"6cae3af7d697c084e7d4c005880395c6b1c5e4d3c37765d1417234d196c5475e"}
Dec 06 09:55:09 crc kubenswrapper[4954]: I1206 09:55:09.090475 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"e4300e66b43185c9447591d77abd07c7ca889205fee3979d468e62b09b6f71bd"}
Dec 06 09:55:09 crc kubenswrapper[4954]: I1206 09:55:09.094118 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"83843c886ba29f82a27aba8b2b6c57806a82934628b99afb2ad901a16590f003"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.110308 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"00e0ff4794aa526d5e30a1de5ebb9916f4a5ed48fe8d7c85232319f766b814fd"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.110814 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"309b29785247fd14c458978fe5bd66b05f9f2563ba3b79f5b7e0e9b114be7a3f"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.110828 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"97c8814eee696fe1556dfe88a177c6cb949d1e560f60468ab0fa13e393a16c7e"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.115194 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"41483b3732f56475171637d37a00fdf085265ee2783883a22782ab5da14b0fdb"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.115276 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"eb4b8b136653f7caeef3897bfcc13e62be1a1e3cd46ce787daa3916287f56058"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.115290 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"25a4cb9482e36d9dfd634d3af5265172747fe41718d5f0a968a6f204e3f98575"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.120119 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"d6907b3fd4a301b22c416439f00a5a37fcecfdf966f7e4153a5599496c9aa6ab"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.120180 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"b0f7382c2957229c58a7bdaea44ae6f39107cfaadcf5940eab57f805771c14ef"}
Dec 06 09:55:10 crc kubenswrapper[4954]: I1206 09:55:10.120193 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"4132462400d4b577fcc2b39ca15b8ec5a0c7a93c7db82f05bbeb422bb6124198"}
Dec 06 09:55:11 crc kubenswrapper[4954]: I1206 09:55:11.137762 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"58f95ddb471068f146c19bc4b30a4b5187984452e4b17008d50c91a999ab5c22"}
Dec 06 09:55:11 crc kubenswrapper[4954]: I1206 09:55:11.142192 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"3019fd225bf99baa065b62d860a9176ea974b2fe26ad04bc360ab5e2a11313f1"}
Dec 06 09:55:11 crc kubenswrapper[4954]: I1206 09:55:11.145792 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"10fd6590a627c13f019aabfd6e2174b7cfc58904944b79658fd51f297ba7ee79"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.200912 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"7f37475687bbae838e68d4e8481795cd59d102a1c46d8a4d4bd3fd3e4a9b6a48"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.201297 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"17a2a198747769d2f1ad713be959d4b3720ea75338190c252ca8b07eea278ba3"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.218028 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"6f9251fe116499c525262f54ab9957cdbbd491a030d4b3161576f9336f40f601"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.218070 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"8755945c5b6ae4dbd394c0e9631d0d8165726b2aa1e8acb43b403af8199b6570"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.222749 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"33802e8ea40d97c1b4870a18ea4363326e59f2ddf5988f9e19f2ea00acf425e7"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.222811 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"df68db37b5f5fd67d58bd141f7dffc49a9023174be95434da5a6f8ce903aea72"}
Dec 06 09:55:12 crc kubenswrapper[4954]: I1206 09:55:12.443493 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921"
Dec 06 09:55:12 crc kubenswrapper[4954]: E1206 09:55:12.444070 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.242529 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"35ec06247a8c0d83924d10629f42dea16f3cec208031f5f3039be5799eb6fab3"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.242897 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"dcbaab5045965524c33655742bf1a6e496969f28204838b3593c5084e2b81b6c"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.242909 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"c6e91ca0cd7c51f6e17cbfc7fc612bb4618b4b037ca5cc4f25a7a363e5898b56"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.254889 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"5dc70b35753a65c067fe259d1bdb2ba388b41d25f4e154b9084397c5a744c9c9"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.254933 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"a80a57005285a3ad3783623a95ffcade0f9d413b3d19a44883cfb28c7b771959"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.254945 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"cb1a1361fb0e5552c6e4f810bfb74703a7ebde60b98e3012a5231d8523b535fc"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.262021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"6eeeba90ebbc97a7d547ca8494a98e21f103171ee3a0714b7043926c4c8570a6"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.262073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"7278d64a59e3eabff2b4a6cdc66da276e8da622f3abbfdc172340cbeb4114057"}
Dec 06 09:55:13 crc kubenswrapper[4954]: I1206 09:55:13.262085 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"9e41479c106ebbdc61d3699f3c669228cc885b45ac2e684989b0f8e0db1783be"}
Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.278114 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"fb7ca5c7-3922-457c-9709-b024e8587ced","Type":"ContainerStarted","Data":"85e50f032a5f2ccbad59a846e2dc8f934bb358c4a5c1743d57cea7af1d7878ab"}
Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.284596 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa5412d3-ec01-4e3f-9525-89669e6a87fc","Type":"ContainerStarted","Data":"4cf8e49837d07c0532b1b2985f66d3a0bb6b6e51289d52bdfa9b95f01c82d8cd"}
Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.289697 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"0054b4c8-5d04-47a9-8794-992ac486c936","Type":"ContainerStarted","Data":"aaf49fd64c91930c426ee76d17a7066154d4e315d8d1e8fadacb9a3b874af20f"}
Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.322679 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-1" podStartSLOduration=6.686454189 podStartE2EDuration="11.322638634s" podCreationTimestamp="2025-12-06 09:55:03 +0000 UTC" firstStartedPulling="2025-12-06 09:55:05.965606269 +0000 UTC m=+10680.778965658"
lastFinishedPulling="2025-12-06 09:55:10.601790704 +0000 UTC m=+10685.415150103" observedRunningTime="2025-12-06 09:55:14.316304355 +0000 UTC m=+10689.129663744" watchObservedRunningTime="2025-12-06 09:55:14.322638634 +0000 UTC m=+10689.135998023" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.363479 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=8.806082299 podStartE2EDuration="11.363452561s" podCreationTimestamp="2025-12-06 09:55:03 +0000 UTC" firstStartedPulling="2025-12-06 09:55:07.960733292 +0000 UTC m=+10682.774092681" lastFinishedPulling="2025-12-06 09:55:10.518103554 +0000 UTC m=+10685.331462943" observedRunningTime="2025-12-06 09:55:14.342901254 +0000 UTC m=+10689.156260663" watchObservedRunningTime="2025-12-06 09:55:14.363452561 +0000 UTC m=+10689.176811950" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.390414 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-2" podStartSLOduration=6.491123385 podStartE2EDuration="11.390389449s" podCreationTimestamp="2025-12-06 09:55:03 +0000 UTC" firstStartedPulling="2025-12-06 09:55:05.640906068 +0000 UTC m=+10680.454265457" lastFinishedPulling="2025-12-06 09:55:10.540172132 +0000 UTC m=+10685.353531521" observedRunningTime="2025-12-06 09:55:14.387299267 +0000 UTC m=+10689.200658656" watchObservedRunningTime="2025-12-06 09:55:14.390389449 +0000 UTC m=+10689.203748838" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.750783 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f4bfb744c-vpc7q"] Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.753494 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.756458 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.784056 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4bfb744c-vpc7q"] Dec 06 09:55:14 crc kubenswrapper[4954]: E1206 09:55:14.788376 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-6rzmr openstack-cell1 openstack-networker ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[config dns-svc dns-swift-storage-0 kube-api-access-6rzmr openstack-cell1 openstack-networker ovsdbserver-nb ovsdbserver-sb]: context canceled" pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" podUID="926f0357-41ec-4478-8f98-ed18ea9df9c8" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.806106 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7569c45955-cmgjc"] Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.808767 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.811654 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-1" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.819141 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-2" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.851380 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569c45955-cmgjc"] Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.865544 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4bfb744c-vpc7q"] Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911476 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-cell1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911528 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-networker\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911587 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911660 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd8vd\" (UniqueName: \"kubernetes.io/projected/9b7d4e13-c0ed-4d44-8586-0706363924f6-kube-api-access-nd8vd\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911703 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911729 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: 
\"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911826 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rzmr\" (UniqueName: \"kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911860 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-config\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911892 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911933 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911950 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.911970 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.912035 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.912058 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-svc\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.912084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-2\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.912109 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.975924 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.977808 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:14 crc kubenswrapper[4954]: I1206 09:55:14.989442 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.015928 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-cell1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.016866 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-networker\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.017522 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018271 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.017446 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-networker\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: 
I1206 09:55:15.018211 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.016810 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-openstack-cell1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018476 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd8vd\" (UniqueName: \"kubernetes.io/projected/9b7d4e13-c0ed-4d44-8586-0706363924f6-kube-api-access-nd8vd\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018519 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018547 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018649 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rzmr\" (UniqueName: \"kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018671 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-config\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018707 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018735 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018753 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018770 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018793 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018881 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018913 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-svc\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018948 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-2\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.018976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.020197 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-2\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.020390 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-svc\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.020437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.021086 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-config\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.021112 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.022662 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.022855 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.023680 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.023738 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/9b7d4e13-c0ed-4d44-8586-0706363924f6-dns-swift-storage-1\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.023945 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.023958 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.024229 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc\") pod 
\"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.024861 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.039830 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rzmr\" (UniqueName: \"kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr\") pod \"dnsmasq-dns-7f4bfb744c-vpc7q\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.045312 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd8vd\" (UniqueName: \"kubernetes.io/projected/9b7d4e13-c0ed-4d44-8586-0706363924f6-kube-api-access-nd8vd\") pod \"dnsmasq-dns-7569c45955-cmgjc\" (UID: \"9b7d4e13-c0ed-4d44-8586-0706363924f6\") " pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.121331 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.121937 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.122075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5652\" (UniqueName: \"kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.122290 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.122438 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.122527 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.122687 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.137928 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.224911 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.224969 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225014 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225050 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225066 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225099 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5652\" (UniqueName: \"kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225175 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " 
pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.225764 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.226354 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.227189 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.231267 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.231362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.231935 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.245516 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5652\" (UniqueName: \"kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652\") pod \"swift-ring-rebalance-debug-dmmbc\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.315784 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.329795 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.437488 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.437959 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.437994 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438204 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438231 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438345 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438466 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438503 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rzmr\" (UniqueName: \"kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438555 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438611 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb\") pod \"926f0357-41ec-4478-8f98-ed18ea9df9c8\" (UID: \"926f0357-41ec-4478-8f98-ed18ea9df9c8\") " Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438954 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config" (OuterVolumeSpecName: "config") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.438978 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.439097 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.439469 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440268 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-networker\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440435 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440484 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440521 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440537 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.440551 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.442240 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.443467 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr" (OuterVolumeSpecName: "kube-api-access-6rzmr") pod "926f0357-41ec-4478-8f98-ed18ea9df9c8" (UID: "926f0357-41ec-4478-8f98-ed18ea9df9c8"). InnerVolumeSpecName "kube-api-access-6rzmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.479023 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.550344 4954 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/926f0357-41ec-4478-8f98-ed18ea9df9c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.550387 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rzmr\" (UniqueName: \"kubernetes.io/projected/926f0357-41ec-4478-8f98-ed18ea9df9c8-kube-api-access-6rzmr\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:15 crc kubenswrapper[4954]: I1206 09:55:15.733165 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569c45955-cmgjc"] Dec 06 09:55:15 crc kubenswrapper[4954]: W1206 09:55:15.737191 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7d4e13_c0ed_4d44_8586_0706363924f6.slice/crio-ba728f1c4b8b404c1840d3776760842b6c3bcf599564c70f05ad211138c2989e WatchSource:0}: Error finding container ba728f1c4b8b404c1840d3776760842b6c3bcf599564c70f05ad211138c2989e: Status 404 returned error can't find the container with id ba728f1c4b8b404c1840d3776760842b6c3bcf599564c70f05ad211138c2989e Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.053630 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.070757 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:16 crc kubenswrapper[4954]: W1206 09:55:16.164710 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb8b369_98b7_4429_b11c_bdb9120c2b89.slice/crio-e0e3415c1f15161837ee787c6cc624831767dadcf38cc68eed69ca260b53ed9a WatchSource:0}: Error finding container e0e3415c1f15161837ee787c6cc624831767dadcf38cc68eed69ca260b53ed9a: Status 404 returned error can't find the container with id e0e3415c1f15161837ee787c6cc624831767dadcf38cc68eed69ca260b53ed9a Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.320453 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-dmmbc" event={"ID":"dfb8b369-98b7-4429-b11c-bdb9120c2b89","Type":"ContainerStarted","Data":"e0e3415c1f15161837ee787c6cc624831767dadcf38cc68eed69ca260b53ed9a"} Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.322602 4954 generic.go:334] "Generic (PLEG): container finished" podID="9b7d4e13-c0ed-4d44-8586-0706363924f6" containerID="169e37794fb5df485a54136e3eb442b51556a0240992e418e244d72c09d620a9" exitCode=0 Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.322654 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" event={"ID":"9b7d4e13-c0ed-4d44-8586-0706363924f6","Type":"ContainerDied","Data":"169e37794fb5df485a54136e3eb442b51556a0240992e418e244d72c09d620a9"} Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.322676 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f4bfb744c-vpc7q" Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.322682 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" event={"ID":"9b7d4e13-c0ed-4d44-8586-0706363924f6","Type":"ContainerStarted","Data":"ba728f1c4b8b404c1840d3776760842b6c3bcf599564c70f05ad211138c2989e"} Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.577552 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4bfb744c-vpc7q"] Dec 06 09:55:16 crc kubenswrapper[4954]: I1206 09:55:16.600373 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f4bfb744c-vpc7q"] Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.335532 4954 generic.go:334] "Generic (PLEG): container finished" podID="72a00ae9-92aa-42b4-8e20-dd0bbb65c689" containerID="1aa8d57065efdbb50603aaab54411ef0afe8971a8164a74f88a564001c790382" exitCode=0 Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.335632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tfp9f" event={"ID":"72a00ae9-92aa-42b4-8e20-dd0bbb65c689","Type":"ContainerDied","Data":"1aa8d57065efdbb50603aaab54411ef0afe8971a8164a74f88a564001c790382"} Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.337347 4954 generic.go:334] "Generic (PLEG): container finished" podID="dfb8b369-98b7-4429-b11c-bdb9120c2b89" containerID="a71e563be2f5a572c0aeffe3cefa343f321fe56d88eb3b8e53ed600f26279fe7" exitCode=2 Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.337409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-dmmbc" event={"ID":"dfb8b369-98b7-4429-b11c-bdb9120c2b89","Type":"ContainerDied","Data":"a71e563be2f5a572c0aeffe3cefa343f321fe56d88eb3b8e53ed600f26279fe7"} Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.343190 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" event={"ID":"9b7d4e13-c0ed-4d44-8586-0706363924f6","Type":"ContainerStarted","Data":"1713aeae423e8e2be4616281fb877855f5d574cf2c8adf48fad191794dc354fc"} Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.343426 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.395876 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" podStartSLOduration=3.39585321 podStartE2EDuration="3.39585321s" podCreationTimestamp="2025-12-06 09:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:55:17.39283656 +0000 UTC m=+10692.206195969" watchObservedRunningTime="2025-12-06 09:55:17.39585321 +0000 UTC m=+10692.209212599" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.462260 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926f0357-41ec-4478-8f98-ed18ea9df9c8" path="/var/lib/kubelet/pods/926f0357-41ec-4478-8f98-ed18ea9df9c8/volumes" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.467729 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.482101 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-dmmbc"] Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 
09:55:17.767702 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.910883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911011 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911118 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911296 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5652\" (UniqueName: \"kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911340 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911556 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.911456 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.912246 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift\") pod \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\" (UID: \"dfb8b369-98b7-4429-b11c-bdb9120c2b89\") " Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.913433 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.913638 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.916733 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652" (OuterVolumeSpecName: "kube-api-access-v5652") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "kube-api-access-v5652". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.958654 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.961599 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts" (OuterVolumeSpecName: "scripts") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.970745 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:17 crc kubenswrapper[4954]: I1206 09:55:17.989415 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dfb8b369-98b7-4429-b11c-bdb9120c2b89" (UID: "dfb8b369-98b7-4429-b11c-bdb9120c2b89"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.015878 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.016131 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.016222 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5652\" (UniqueName: \"kubernetes.io/projected/dfb8b369-98b7-4429-b11c-bdb9120c2b89-kube-api-access-v5652\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.016303 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb8b369-98b7-4429-b11c-bdb9120c2b89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.016384 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfb8b369-98b7-4429-b11c-bdb9120c2b89-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.016511 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dfb8b369-98b7-4429-b11c-bdb9120c2b89-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.363042 4954 scope.go:117] "RemoveContainer" containerID="a71e563be2f5a572c0aeffe3cefa343f321fe56d88eb3b8e53ed600f26279fe7" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.364859 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dmmbc" Dec 06 09:55:18 crc kubenswrapper[4954]: E1206 09:55:18.774151 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb8b369_98b7_4429_b11c_bdb9120c2b89.slice\": RecentStats: unable to find data in memory cache]" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.871178 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tfp9f" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.878667 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgm6s\" (UniqueName: \"kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.878847 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.879147 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.879174 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.879276 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.885926 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s" (OuterVolumeSpecName: "kube-api-access-pgm6s") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "kube-api-access-pgm6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.913542 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.933535 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts" (OuterVolumeSpecName: "scripts") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.935120 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.939640 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981192 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981239 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices\") pod \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\" (UID: \"72a00ae9-92aa-42b4-8e20-dd0bbb65c689\") " Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981649 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981667 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981677 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981685 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981695 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgm6s\" (UniqueName: \"kubernetes.io/projected/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-kube-api-access-pgm6s\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981792 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:18 crc kubenswrapper[4954]: I1206 09:55:18.981916 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "72a00ae9-92aa-42b4-8e20-dd0bbb65c689" (UID: "72a00ae9-92aa-42b4-8e20-dd0bbb65c689"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.082493 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.082521 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72a00ae9-92aa-42b4-8e20-dd0bbb65c689-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.377030 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tfp9f" event={"ID":"72a00ae9-92aa-42b4-8e20-dd0bbb65c689","Type":"ContainerDied","Data":"462f9c336381ec26da9831b0734bf8daf965a920917bc922e2f8184a700d2173"} Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.377073 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462f9c336381ec26da9831b0734bf8daf965a920917bc922e2f8184a700d2173" Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.377120 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tfp9f" Dec 06 09:55:19 crc kubenswrapper[4954]: I1206 09:55:19.463483 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb8b369-98b7-4429-b11c-bdb9120c2b89" path="/var/lib/kubelet/pods/dfb8b369-98b7-4429-b11c-bdb9120c2b89/volumes" Dec 06 09:55:23 crc kubenswrapper[4954]: I1206 09:55:23.460498 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:55:23 crc kubenswrapper[4954]: E1206 09:55:23.461478 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.139547 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7569c45955-cmgjc" Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.231822 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.232159 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="dnsmasq-dns" containerID="cri-o://f07ffba0499a139f0ec42b4e5d985dbdfd0c9dd5a579844a2d0a9ba510fa0011" gracePeriod=10 Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.505723 4954 generic.go:334] "Generic (PLEG): container finished" podID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerID="f07ffba0499a139f0ec42b4e5d985dbdfd0c9dd5a579844a2d0a9ba510fa0011" exitCode=0 Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.505931 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" event={"ID":"8664325b-eecf-4fdf-aa85-0a399a4b29c3","Type":"ContainerDied","Data":"f07ffba0499a139f0ec42b4e5d985dbdfd0c9dd5a579844a2d0a9ba510fa0011"} Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 
09:55:25.894220 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.982127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25sd7\" (UniqueName: \"kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.982245 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.983076 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.983116 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.983150 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.983223 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:25 crc kubenswrapper[4954]: I1206 09:55:25.983284 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc\") pod \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\" (UID: \"8664325b-eecf-4fdf-aa85-0a399a4b29c3\") " Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:25.999403 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7" (OuterVolumeSpecName: "kube-api-access-25sd7") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "kube-api-access-25sd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.047896 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.052665 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config" (OuterVolumeSpecName: "config") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.062610 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.064808 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.068311 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086008 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25sd7\" (UniqueName: \"kubernetes.io/projected/8664325b-eecf-4fdf-aa85-0a399a4b29c3-kube-api-access-25sd7\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086052 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086067 4954 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-config\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086080 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086093 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-openstack-networker\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.086103 4954 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.096166 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8664325b-eecf-4fdf-aa85-0a399a4b29c3" (UID: "8664325b-eecf-4fdf-aa85-0a399a4b29c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.188128 4954 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8664325b-eecf-4fdf-aa85-0a399a4b29c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.516979 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" event={"ID":"8664325b-eecf-4fdf-aa85-0a399a4b29c3","Type":"ContainerDied","Data":"8c96d73bc317d76b42612281120ae5b228a8dfa23b65c2093a0983a7c82b7d27"} Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.517242 4954 scope.go:117] "RemoveContainer" containerID="f07ffba0499a139f0ec42b4e5d985dbdfd0c9dd5a579844a2d0a9ba510fa0011" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.517260 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77648b885f-ll6cm" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.574670 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.575106 4954 scope.go:117] "RemoveContainer" containerID="c8b76ccac25d01f437eaea230ed0ae41432c9fe1429d68e11e3a1ae734cd03b3" Dec 06 09:55:26 crc kubenswrapper[4954]: I1206 09:55:26.587722 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77648b885f-ll6cm"] Dec 06 09:55:27 crc kubenswrapper[4954]: I1206 09:55:27.455492 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" path="/var/lib/kubelet/pods/8664325b-eecf-4fdf-aa85-0a399a4b29c3/volumes" Dec 06 09:55:31 crc kubenswrapper[4954]: I1206 09:55:31.088695 4954 scope.go:117] "RemoveContainer" containerID="43cb550dbdcb844f46e4b11a4086dd9b61fa298d97c537b605a9f664c7fd0320" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.398104 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:33 crc kubenswrapper[4954]: E1206 09:55:33.398838 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="init" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.398851 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="init" Dec 06 09:55:33 crc kubenswrapper[4954]: E1206 09:55:33.398885 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a00ae9-92aa-42b4-8e20-dd0bbb65c689" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.398893 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a00ae9-92aa-42b4-8e20-dd0bbb65c689" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: E1206 09:55:33.398917 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="dnsmasq-dns" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.398923 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="dnsmasq-dns" Dec 06 09:55:33 crc 
kubenswrapper[4954]: E1206 09:55:33.398936 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb8b369-98b7-4429-b11c-bdb9120c2b89" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.398942 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb8b369-98b7-4429-b11c-bdb9120c2b89" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.399141 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8664325b-eecf-4fdf-aa85-0a399a4b29c3" containerName="dnsmasq-dns" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.399157 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a00ae9-92aa-42b4-8e20-dd0bbb65c689" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.399179 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb8b369-98b7-4429-b11c-bdb9120c2b89" containerName="swift-ring-rebalance" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.400762 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.413843 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.435803 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.435974 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.436056 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlj9k\" (UniqueName: \"kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.538272 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.538451 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.538585 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vlj9k\" (UniqueName: \"kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.540239 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.540262 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.559913 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlj9k\" (UniqueName: \"kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k\") pod \"certified-operators-x5zbm\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:33 crc kubenswrapper[4954]: I1206 09:55:33.718056 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:34 crc kubenswrapper[4954]: I1206 09:55:34.931534 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:34 crc kubenswrapper[4954]: W1206 09:55:34.939052 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69750506_794d_4627_986c_d472fa97d023.slice/crio-c87ac19451d7b726ea8a72ab432eaa78eb7090b5f04bc4f8c9d175af577efa7a WatchSource:0}: Error finding container c87ac19451d7b726ea8a72ab432eaa78eb7090b5f04bc4f8c9d175af577efa7a: Status 404 returned error can't find the container with id c87ac19451d7b726ea8a72ab432eaa78eb7090b5f04bc4f8c9d175af577efa7a Dec 06 09:55:35 crc kubenswrapper[4954]: I1206 09:55:35.706909 4954 generic.go:334] "Generic (PLEG): container finished" podID="69750506-794d-4627-986c-d472fa97d023" containerID="ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d" exitCode=0 Dec 06 09:55:35 crc kubenswrapper[4954]: I1206 09:55:35.707849 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerDied","Data":"ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d"} Dec 06 09:55:35 crc kubenswrapper[4954]: I1206 09:55:35.707996 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerStarted","Data":"c87ac19451d7b726ea8a72ab432eaa78eb7090b5f04bc4f8c9d175af577efa7a"} Dec 06 09:55:36 crc kubenswrapper[4954]: I1206 09:55:36.443445 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:55:36 crc kubenswrapper[4954]: E1206 09:55:36.446525 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:55:36 crc kubenswrapper[4954]: I1206 09:55:36.722060 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerStarted","Data":"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897"} Dec 06 09:55:37 crc kubenswrapper[4954]: I1206 09:55:37.733631 4954 generic.go:334] "Generic (PLEG): container finished" podID="69750506-794d-4627-986c-d472fa97d023" containerID="7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897" exitCode=0 Dec 06 09:55:37 crc kubenswrapper[4954]: I1206 09:55:37.733688 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerDied","Data":"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897"} Dec 06 09:55:38 crc kubenswrapper[4954]: I1206 09:55:38.745961 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerStarted","Data":"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc"} Dec 06 09:55:38 crc kubenswrapper[4954]: I1206 09:55:38.810133 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x5zbm" podStartSLOduration=3.369590043 podStartE2EDuration="5.810090182s" podCreationTimestamp="2025-12-06 09:55:33 +0000 UTC" firstStartedPulling="2025-12-06 09:55:35.71260467 +0000 UTC m=+10710.525964059" lastFinishedPulling="2025-12-06 09:55:38.153104809 +0000 UTC m=+10712.966464198" observedRunningTime="2025-12-06 09:55:38.775677075 +0000 UTC m=+10713.589036474" watchObservedRunningTime="2025-12-06 09:55:38.810090182 +0000 UTC m=+10713.623449571" Dec 06 09:55:43 crc kubenswrapper[4954]: I1206 09:55:43.718208 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:43 crc kubenswrapper[4954]: I1206 09:55:43.718813 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:43 crc kubenswrapper[4954]: I1206 09:55:43.794128 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:43 crc kubenswrapper[4954]: I1206 09:55:43.933381 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:44 crc kubenswrapper[4954]: I1206 09:55:44.033947 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:45 crc kubenswrapper[4954]: I1206 09:55:45.863465 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x5zbm" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="registry-server" containerID="cri-o://554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc" gracePeriod=2 Dec 06 09:55:46 crc 
kubenswrapper[4954]: I1206 09:55:46.328066 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.464952 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlj9k\" (UniqueName: \"kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k\") pod \"69750506-794d-4627-986c-d472fa97d023\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.465012 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content\") pod \"69750506-794d-4627-986c-d472fa97d023\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.465273 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities\") pod \"69750506-794d-4627-986c-d472fa97d023\" (UID: \"69750506-794d-4627-986c-d472fa97d023\") " Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.466066 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities" (OuterVolumeSpecName: "utilities") pod "69750506-794d-4627-986c-d472fa97d023" (UID: "69750506-794d-4627-986c-d472fa97d023"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.471013 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k" (OuterVolumeSpecName: "kube-api-access-vlj9k") pod "69750506-794d-4627-986c-d472fa97d023" (UID: "69750506-794d-4627-986c-d472fa97d023"). InnerVolumeSpecName "kube-api-access-vlj9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.521502 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69750506-794d-4627-986c-d472fa97d023" (UID: "69750506-794d-4627-986c-d472fa97d023"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.572348 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.572381 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlj9k\" (UniqueName: \"kubernetes.io/projected/69750506-794d-4627-986c-d472fa97d023-kube-api-access-vlj9k\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.572392 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69750506-794d-4627-986c-d472fa97d023-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.884600 4954 generic.go:334] "Generic (PLEG): container finished" podID="69750506-794d-4627-986c-d472fa97d023" containerID="554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc" exitCode=0 Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.884717 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5zbm" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.884743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerDied","Data":"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc"} Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.885170 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5zbm" event={"ID":"69750506-794d-4627-986c-d472fa97d023","Type":"ContainerDied","Data":"c87ac19451d7b726ea8a72ab432eaa78eb7090b5f04bc4f8c9d175af577efa7a"} Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.885208 4954 scope.go:117] "RemoveContainer" containerID="554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.923363 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.925389 4954 scope.go:117] "RemoveContainer" containerID="7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897" Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.940478 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x5zbm"] Dec 06 09:55:46 crc kubenswrapper[4954]: I1206 09:55:46.947996 4954 scope.go:117] "RemoveContainer" containerID="ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.020229 4954 scope.go:117] "RemoveContainer" containerID="554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc" Dec 06 09:55:47 crc kubenswrapper[4954]: E1206 09:55:47.021213 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc\": container with ID starting with 554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc not found: ID does not exist" containerID="554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.021282 
4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc"} err="failed to get container status \"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc\": rpc error: code = NotFound desc = could not find container \"554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc\": container with ID starting with 554bb72f906938226bce4191dcb01c4606aeb5a4ab94f6d986bb33d9803a18dc not found: ID does not exist" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.021318 4954 scope.go:117] "RemoveContainer" containerID="7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897" Dec 06 09:55:47 crc kubenswrapper[4954]: E1206 09:55:47.021815 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897\": container with ID starting with 7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897 not found: ID does not exist" containerID="7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.021863 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897"} err="failed to get container status \"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897\": rpc error: code = NotFound desc = could not find container \"7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897\": container with ID starting with 7be453a558327248bb3e3450f9735bd1aad0945c6df0ae332f39ec8afe59f897 not found: ID does not exist" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.021892 4954 scope.go:117] "RemoveContainer" containerID="ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d" Dec 06 09:55:47 crc kubenswrapper[4954]: E1206 09:55:47.022229 4954 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d\": container with ID starting with ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d not found: ID does not exist" containerID="ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.022257 4954 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d"} err="failed to get container status \"ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d\": rpc error: code = NotFound desc = could not find container \"ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d\": container with ID starting with ecaabb5f01117db23314d65df04989a69cda0ae175636a19a330c30142ef9e1d not found: ID does not exist" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.443555 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:55:47 crc kubenswrapper[4954]: E1206 09:55:47.443835 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:55:47 crc kubenswrapper[4954]: I1206 09:55:47.454158 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69750506-794d-4627-986c-d472fa97d023" path="/var/lib/kubelet/pods/69750506-794d-4627-986c-d472fa97d023/volumes" Dec 06 09:55:59 crc kubenswrapper[4954]: I1206 09:55:59.444041 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:55:59 crc kubenswrapper[4954]: E1206 09:55:59.444777 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:56:12 crc kubenswrapper[4954]: I1206 09:56:12.444263 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:56:12 crc kubenswrapper[4954]: E1206 09:56:12.446550 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.683796 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-n9dxt"] Dec 06 09:56:17 crc kubenswrapper[4954]: E1206 09:56:17.684803 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="registry-server" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.684818 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="registry-server" Dec 06 09:56:17 crc kubenswrapper[4954]: E1206 09:56:17.684855 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="extract-content" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.684861 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="extract-content" Dec 06 09:56:17 crc kubenswrapper[4954]: E1206 09:56:17.684884 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="extract-utilities" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.684890 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="extract-utilities" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.685109 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="69750506-794d-4627-986c-d472fa97d023" containerName="registry-server" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.685941 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.688161 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.688475 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.714711 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-n9dxt"] Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841350 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841409 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841446 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmbv\" (UniqueName: \"kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841500 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841554 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.841611 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943510 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943605 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943647 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943696 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943826 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.943863 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmbv\" (UniqueName: \"kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.944166 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.944789 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.944812 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.949645 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.956159 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.956947 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:17 crc kubenswrapper[4954]: I1206 09:56:17.965115 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmbv\" (UniqueName: \"kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv\") pod \"swift-ring-rebalance-debug-n9dxt\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:18 crc kubenswrapper[4954]: I1206 09:56:18.022262 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:18 crc kubenswrapper[4954]: I1206 09:56:18.486990 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-n9dxt"] Dec 06 09:56:18 crc kubenswrapper[4954]: W1206 09:56:18.495053 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a551be6_c68f_4a66_b4fe_c988c394a7dd.slice/crio-3a393864a02a3ed5660e48a6add38643addeadb343bc96c84d56a1a732abdfb5 WatchSource:0}: Error finding container 3a393864a02a3ed5660e48a6add38643addeadb343bc96c84d56a1a732abdfb5: Status 404 returned error can't find the container with id 3a393864a02a3ed5660e48a6add38643addeadb343bc96c84d56a1a732abdfb5 Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.319160 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a551be6-c68f-4a66-b4fe-c988c394a7dd" containerID="ab697388e14dbaf3c28f2d6314567482baffe393342d0219d891d8b8d5413de3" exitCode=0 Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.319312 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-n9dxt" event={"ID":"9a551be6-c68f-4a66-b4fe-c988c394a7dd","Type":"ContainerDied","Data":"ab697388e14dbaf3c28f2d6314567482baffe393342d0219d891d8b8d5413de3"} Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.319539 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-n9dxt" event={"ID":"9a551be6-c68f-4a66-b4fe-c988c394a7dd","Type":"ContainerStarted","Data":"3a393864a02a3ed5660e48a6add38643addeadb343bc96c84d56a1a732abdfb5"} Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.379637 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-n9dxt"] Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.388523 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-n9dxt"] Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.916332 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-9fdwm"] Dec 06 09:56:19 crc kubenswrapper[4954]: E1206 09:56:19.917177 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a551be6-c68f-4a66-b4fe-c988c394a7dd" containerName="swift-ring-rebalance" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.917193 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a551be6-c68f-4a66-b4fe-c988c394a7dd" containerName="swift-ring-rebalance" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.917493 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a551be6-c68f-4a66-b4fe-c988c394a7dd" containerName="swift-ring-rebalance" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.918446 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.936506 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-9fdwm"] Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975133 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975223 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcm2g\" (UniqueName: \"kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975290 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975309 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975356 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:19 crc kubenswrapper[4954]: I1206 09:56:19.975526 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078543 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc 
kubenswrapper[4954]: I1206 09:56:20.078633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078700 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcm2g\" (UniqueName: \"kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078803 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078831 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.078923 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.080673 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.081780 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.081996 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: 
I1206 09:56:20.084546 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.088130 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.108713 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.109985 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcm2g\" (UniqueName: \"kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g\") pod \"swift-ring-rebalance-debug-9fdwm\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.250902 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.738323 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.766909 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-9fdwm"] Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792437 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792517 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792623 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhmbv\" (UniqueName: \"kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792690 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792722 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792764 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.792969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift\") pod \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\" (UID: \"9a551be6-c68f-4a66-b4fe-c988c394a7dd\") " Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.793994 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.794365 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.794549 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.794602 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9a551be6-c68f-4a66-b4fe-c988c394a7dd-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.797813 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv" (OuterVolumeSpecName: "kube-api-access-dhmbv") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "kube-api-access-dhmbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.822669 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.822819 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts" (OuterVolumeSpecName: "scripts") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.837301 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.839523 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9a551be6-c68f-4a66-b4fe-c988c394a7dd" (UID: "9a551be6-c68f-4a66-b4fe-c988c394a7dd"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.896679 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a551be6-c68f-4a66-b4fe-c988c394a7dd-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.896983 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.897102 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhmbv\" (UniqueName: \"kubernetes.io/projected/9a551be6-c68f-4a66-b4fe-c988c394a7dd-kube-api-access-dhmbv\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.897209 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:20 crc kubenswrapper[4954]: I1206 09:56:20.897283 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9a551be6-c68f-4a66-b4fe-c988c394a7dd-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.353293 4954 scope.go:117] "RemoveContainer" containerID="ab697388e14dbaf3c28f2d6314567482baffe393342d0219d891d8b8d5413de3" Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.353304 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-n9dxt" Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.356348 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-9fdwm" event={"ID":"7710e371-287b-429f-8db4-a0fc93b953ee","Type":"ContainerStarted","Data":"7084f9f2c6d4f2c646974f30ba278cdceb8866223eb16a1744fe61780fca38ca"} Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.356381 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-9fdwm" event={"ID":"7710e371-287b-429f-8db4-a0fc93b953ee","Type":"ContainerStarted","Data":"64f914814d6773a7a37d7bb3bdbee7d60a2f3bed59d15be355fa0650f64e4876"} Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.383379 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-9fdwm" podStartSLOduration=2.383344197 podStartE2EDuration="2.383344197s" podCreationTimestamp="2025-12-06 09:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:56:21.376592177 +0000 UTC m=+10756.189951626" watchObservedRunningTime="2025-12-06 09:56:21.383344197 +0000 UTC m=+10756.196703626" Dec 06 09:56:21 crc kubenswrapper[4954]: I1206 09:56:21.457910 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a551be6-c68f-4a66-b4fe-c988c394a7dd" path="/var/lib/kubelet/pods/9a551be6-c68f-4a66-b4fe-c988c394a7dd/volumes" Dec 06 09:56:25 crc kubenswrapper[4954]: I1206 09:56:25.452828 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:56:25 crc kubenswrapper[4954]: E1206 09:56:25.453477 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:56:29 crc kubenswrapper[4954]: I1206 09:56:29.442173 4954 generic.go:334] "Generic (PLEG): container finished" podID="7710e371-287b-429f-8db4-a0fc93b953ee" containerID="7084f9f2c6d4f2c646974f30ba278cdceb8866223eb16a1744fe61780fca38ca" exitCode=0 Dec 06 09:56:29 crc kubenswrapper[4954]: I1206 09:56:29.459581 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-9fdwm" event={"ID":"7710e371-287b-429f-8db4-a0fc93b953ee","Type":"ContainerDied","Data":"7084f9f2c6d4f2c646974f30ba278cdceb8866223eb16a1744fe61780fca38ca"} Dec 06 09:56:30 crc kubenswrapper[4954]: I1206 09:56:30.957242 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.020389 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-9fdwm"] Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.035190 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-9fdwm"] Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.066923 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.067782 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.067911 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.067970 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcm2g\" (UniqueName: \"kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.068097 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.068155 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 
09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.068178 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle\") pod \"7710e371-287b-429f-8db4-a0fc93b953ee\" (UID: \"7710e371-287b-429f-8db4-a0fc93b953ee\") " Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.068466 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.069010 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.069281 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.082007 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g" (OuterVolumeSpecName: "kube-api-access-wcm2g") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "kube-api-access-wcm2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.101020 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.101729 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts" (OuterVolumeSpecName: "scripts") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.116437 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.128852 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7710e371-287b-429f-8db4-a0fc93b953ee" (UID: "7710e371-287b-429f-8db4-a0fc93b953ee"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.171936 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7710e371-287b-429f-8db4-a0fc93b953ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.172017 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.172040 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcm2g\" (UniqueName: \"kubernetes.io/projected/7710e371-287b-429f-8db4-a0fc93b953ee-kube-api-access-wcm2g\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.172060 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.172082 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7710e371-287b-429f-8db4-a0fc93b953ee-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.172099 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7710e371-287b-429f-8db4-a0fc93b953ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.464332 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7710e371-287b-429f-8db4-a0fc93b953ee" path="/var/lib/kubelet/pods/7710e371-287b-429f-8db4-a0fc93b953ee/volumes" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.469502 4954 scope.go:117] "RemoveContainer" containerID="7084f9f2c6d4f2c646974f30ba278cdceb8866223eb16a1744fe61780fca38ca" Dec 06 09:56:31 crc kubenswrapper[4954]: I1206 09:56:31.469712 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-9fdwm" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.557280 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-hgl6c"] Dec 06 09:56:34 crc kubenswrapper[4954]: E1206 09:56:34.558667 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7710e371-287b-429f-8db4-a0fc93b953ee" containerName="swift-ring-rebalance" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.558696 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7710e371-287b-429f-8db4-a0fc93b953ee" containerName="swift-ring-rebalance" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.559091 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7710e371-287b-429f-8db4-a0fc93b953ee" containerName="swift-ring-rebalance" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.560519 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.562654 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.563209 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.588592 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-hgl6c"] Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.648456 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.648597 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.648627 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qzg\" (UniqueName: \"kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.649012 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.649205 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: 
\"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.649517 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.649670 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.751452 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.751999 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752091 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752143 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752240 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752344 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752381 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7qzg\" (UniqueName: \"kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg\") pod 
\"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752790 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.752805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.753535 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.760177 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.761734 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.765385 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.786812 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7qzg\" (UniqueName: \"kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg\") pod \"swift-ring-rebalance-debug-hgl6c\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:34 crc kubenswrapper[4954]: I1206 09:56:34.896155 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:35 crc kubenswrapper[4954]: I1206 09:56:35.440350 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-hgl6c"] Dec 06 09:56:35 crc kubenswrapper[4954]: I1206 09:56:35.525111 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-hgl6c" event={"ID":"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf","Type":"ContainerStarted","Data":"b92502cde3a653d5bee8b7fd84644a8a275ebb29b47969a0c6f70eb191274c66"} Dec 06 09:56:36 crc kubenswrapper[4954]: I1206 09:56:36.547073 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-hgl6c" event={"ID":"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf","Type":"ContainerStarted","Data":"b83a58551c2fc4d66eac24c96715a9875d9e86a16142d90e2cad9833aeb237e0"} Dec 06 09:56:36 crc kubenswrapper[4954]: I1206 09:56:36.582394 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-hgl6c" podStartSLOduration=2.5823747040000002 podStartE2EDuration="2.582374704s" podCreationTimestamp="2025-12-06 09:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:56:36.57693967 +0000 UTC m=+10771.390299059" watchObservedRunningTime="2025-12-06 09:56:36.582374704 +0000 UTC m=+10771.395734093" Dec 06 09:56:38 crc kubenswrapper[4954]: I1206 09:56:38.444586 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:56:38 crc kubenswrapper[4954]: E1206 09:56:38.445207 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 09:56:44 crc kubenswrapper[4954]: I1206 09:56:44.643666 4954 generic.go:334] "Generic (PLEG): container finished" podID="7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" containerID="b83a58551c2fc4d66eac24c96715a9875d9e86a16142d90e2cad9833aeb237e0" exitCode=0 Dec 06 09:56:44 crc kubenswrapper[4954]: I1206 09:56:44.643758 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-hgl6c" event={"ID":"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf","Type":"ContainerDied","Data":"b83a58551c2fc4d66eac24c96715a9875d9e86a16142d90e2cad9833aeb237e0"} Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.467784 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.541195 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-hgl6c"] Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548646 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548692 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548717 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548809 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548844 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.548893 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.549327 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7qzg\" (UniqueName: \"kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg\") pod \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\" (UID: \"7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf\") " Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.550004 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.550150 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.552268 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.552299 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.552543 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-hgl6c"] Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.556034 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg" (OuterVolumeSpecName: "kube-api-access-t7qzg") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "kube-api-access-t7qzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.593094 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.600289 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts" (OuterVolumeSpecName: "scripts") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.602938 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.606002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" (UID: "7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.655342 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7qzg\" (UniqueName: \"kubernetes.io/projected/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-kube-api-access-t7qzg\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.655412 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.655438 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.655456 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.655474 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.688144 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92502cde3a653d5bee8b7fd84644a8a275ebb29b47969a0c6f70eb191274c66" Dec 06 09:56:46 crc kubenswrapper[4954]: I1206 09:56:46.688184 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-hgl6c" Dec 06 09:56:47 crc kubenswrapper[4954]: I1206 09:56:47.456745 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" path="/var/lib/kubelet/pods/7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf/volumes" Dec 06 09:56:53 crc kubenswrapper[4954]: I1206 09:56:53.443994 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 09:56:53 crc kubenswrapper[4954]: I1206 09:56:53.772320 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0"} Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.726643 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-lx6vg"] Dec 06 09:57:46 crc kubenswrapper[4954]: E1206 09:57:46.727801 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" containerName="swift-ring-rebalance" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.727820 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" containerName="swift-ring-rebalance" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.728263 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3dcdef-ba95-4e8d-bb2f-e44c783f9bcf" containerName="swift-ring-rebalance" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.729224 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.733665 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.733989 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.767141 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-lx6vg"] Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.820929 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821024 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821099 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821127 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821159 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx2rb\" (UniqueName: \"kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821183 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.821301 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923514 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923631 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923703 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923766 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923796 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923832 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx2rb\" (UniqueName: \"kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.923861 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.924628 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.924987 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.925041 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.929671 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.929888 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.930693 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:46 crc kubenswrapper[4954]: I1206 09:57:46.945702 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx2rb\" (UniqueName: \"kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb\") pod \"swift-ring-rebalance-debug-lx6vg\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:47 crc kubenswrapper[4954]: I1206 09:57:47.052077 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:47 crc kubenswrapper[4954]: I1206 09:57:47.587068 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-lx6vg"] Dec 06 09:57:48 crc kubenswrapper[4954]: I1206 09:57:48.471083 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-lx6vg" event={"ID":"fd5f2963-a530-4e55-ac7d-cc9f89742c41","Type":"ContainerStarted","Data":"c87e953c34aab8e8d5c12c6e1164cb4b2de32ca87ac8f996996f046b5e205d09"} Dec 06 09:57:48 crc kubenswrapper[4954]: I1206 09:57:48.471482 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-lx6vg" event={"ID":"fd5f2963-a530-4e55-ac7d-cc9f89742c41","Type":"ContainerStarted","Data":"514e95af6ec9dd656fad178c0785d4973a5ff10866c27d17083ea34b342c711b"} Dec 06 09:57:48 crc kubenswrapper[4954]: I1206 09:57:48.586783 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-lx6vg" podStartSLOduration=2.586754137 podStartE2EDuration="2.586754137s" podCreationTimestamp="2025-12-06 09:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:57:48.563889468 +0000 UTC m=+10843.377248927" watchObservedRunningTime="2025-12-06 09:57:48.586754137 +0000 UTC m=+10843.400113556" Dec 06 09:57:57 crc kubenswrapper[4954]: I1206 09:57:57.581037 4954 generic.go:334] "Generic (PLEG): container finished" podID="fd5f2963-a530-4e55-ac7d-cc9f89742c41" containerID="c87e953c34aab8e8d5c12c6e1164cb4b2de32ca87ac8f996996f046b5e205d09" exitCode=0 Dec 06 09:57:57 crc kubenswrapper[4954]: I1206 09:57:57.581112 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-lx6vg" event={"ID":"fd5f2963-a530-4e55-ac7d-cc9f89742c41","Type":"ContainerDied","Data":"c87e953c34aab8e8d5c12c6e1164cb4b2de32ca87ac8f996996f046b5e205d09"} Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.570602 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.631680 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-lx6vg"] Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.640695 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-lx6vg"] Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.644387 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514e95af6ec9dd656fad178c0785d4973a5ff10866c27d17083ea34b342c711b" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.644452 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-lx6vg" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.644954 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645118 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645244 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645300 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645336 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx2rb\" (UniqueName: \"kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645587 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift\") pod \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\" (UID: \"fd5f2963-a530-4e55-ac7d-cc9f89742c41\") " Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.645779 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.646050 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.646887 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.671110 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb" (OuterVolumeSpecName: "kube-api-access-nx2rb") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "kube-api-access-nx2rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.687706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.692920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.704857 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts" (OuterVolumeSpecName: "scripts") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.748103 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.748141 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.748153 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx2rb\" (UniqueName: \"kubernetes.io/projected/fd5f2963-a530-4e55-ac7d-cc9f89742c41-kube-api-access-nx2rb\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.748161 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5f2963-a530-4e55-ac7d-cc9f89742c41-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.748176 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5f2963-a530-4e55-ac7d-cc9f89742c41-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.752401 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fd5f2963-a530-4e55-ac7d-cc9f89742c41" (UID: "fd5f2963-a530-4e55-ac7d-cc9f89742c41"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:57:59 crc kubenswrapper[4954]: I1206 09:57:59.849655 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5f2963-a530-4e55-ac7d-cc9f89742c41-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:58:01 crc kubenswrapper[4954]: I1206 09:58:01.456880 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5f2963-a530-4e55-ac7d-cc9f89742c41" path="/var/lib/kubelet/pods/fd5f2963-a530-4e55-ac7d-cc9f89742c41/volumes" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.850885 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-dfb6p"] Dec 06 09:58:59 crc kubenswrapper[4954]: E1206 09:58:59.851903 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5f2963-a530-4e55-ac7d-cc9f89742c41" containerName="swift-ring-rebalance" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.851980 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5f2963-a530-4e55-ac7d-cc9f89742c41" containerName="swift-ring-rebalance" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.852269 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5f2963-a530-4e55-ac7d-cc9f89742c41" containerName="swift-ring-rebalance" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.852985 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.863155 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-dfb6p"] Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.913332 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 09:58:59 crc kubenswrapper[4954]: I1206 09:58:59.913778 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037036 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037125 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037170 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037196 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037253 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037390 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.037445 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbvs\" (UniqueName: \"kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140130 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140185 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140260 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140312 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbvs\" (UniqueName: \"kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140340 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140552 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140614 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.140963 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.141606 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.142120 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.150261 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.150362 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.150375 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.162676 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbvs\" (UniqueName: \"kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs\") pod \"swift-ring-rebalance-debug-dfb6p\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.237146 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:00 crc kubenswrapper[4954]: I1206 09:59:00.735710 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-dfb6p"] Dec 06 09:59:00 crc kubenswrapper[4954]: W1206 09:59:00.741783 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcecd3145_8e9b_4a17_8764_a2f4afd6945c.slice/crio-81cad106bdb9e284c7c6b87287afddbf0c6006553730336fcdd9424fa7c6750d WatchSource:0}: Error finding container 81cad106bdb9e284c7c6b87287afddbf0c6006553730336fcdd9424fa7c6750d: Status 404 returned error can't find the container with id 81cad106bdb9e284c7c6b87287afddbf0c6006553730336fcdd9424fa7c6750d Dec 06 09:59:01 crc kubenswrapper[4954]: I1206 09:59:01.267374 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-dfb6p" event={"ID":"cecd3145-8e9b-4a17-8764-a2f4afd6945c","Type":"ContainerStarted","Data":"6a62c39c882933b46cfe1a28ff01fcc7b26b1365d2bd856458f159ad0aeeb326"} Dec 06 09:59:01 crc kubenswrapper[4954]: I1206 09:59:01.267863 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-dfb6p" event={"ID":"cecd3145-8e9b-4a17-8764-a2f4afd6945c","Type":"ContainerStarted","Data":"81cad106bdb9e284c7c6b87287afddbf0c6006553730336fcdd9424fa7c6750d"} Dec 06 09:59:01 crc kubenswrapper[4954]: I1206 09:59:01.297745 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-dfb6p" podStartSLOduration=2.297550669 podStartE2EDuration="2.297550669s" podCreationTimestamp="2025-12-06 09:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 09:59:01.283035242 +0000 UTC m=+10916.096394631" watchObservedRunningTime="2025-12-06 09:59:01.297550669 +0000 UTC m=+10916.110910058" Dec 06 09:59:10 crc kubenswrapper[4954]: I1206 09:59:10.101827 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:59:10 crc kubenswrapper[4954]: I1206 09:59:10.102373 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 09:59:10 crc kubenswrapper[4954]: I1206 09:59:10.361220 4954 generic.go:334] "Generic (PLEG): container finished" podID="cecd3145-8e9b-4a17-8764-a2f4afd6945c" containerID="6a62c39c882933b46cfe1a28ff01fcc7b26b1365d2bd856458f159ad0aeeb326" exitCode=0 Dec 06 09:59:10 crc kubenswrapper[4954]: I1206 09:59:10.361263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-dfb6p" event={"ID":"cecd3145-8e9b-4a17-8764-a2f4afd6945c","Type":"ContainerDied","Data":"6a62c39c882933b46cfe1a28ff01fcc7b26b1365d2bd856458f159ad0aeeb326"} Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.245070 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.306386 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbvs\" (UniqueName: \"kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.306832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.307121 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.307265 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.307401 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.307503 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.307671 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle\") pod \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\" (UID: \"cecd3145-8e9b-4a17-8764-a2f4afd6945c\") " Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.312671 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-dfb6p"] Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.313366 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.315755 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.321142 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs" (OuterVolumeSpecName: "kube-api-access-tgbvs") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "kube-api-access-tgbvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.353044 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-dfb6p"] Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.365282 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts" (OuterVolumeSpecName: "scripts") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.371217 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.376723 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.383793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cecd3145-8e9b-4a17-8764-a2f4afd6945c" (UID: "cecd3145-8e9b-4a17-8764-a2f4afd6945c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.386965 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81cad106bdb9e284c7c6b87287afddbf0c6006553730336fcdd9424fa7c6750d" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.387088 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-dfb6p" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.410945 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.410978 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.410987 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.410996 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cecd3145-8e9b-4a17-8764-a2f4afd6945c-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.411003 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd3145-8e9b-4a17-8764-a2f4afd6945c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.411012 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbvs\" (UniqueName: \"kubernetes.io/projected/cecd3145-8e9b-4a17-8764-a2f4afd6945c-kube-api-access-tgbvs\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:12 crc kubenswrapper[4954]: I1206 09:59:12.411023 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cecd3145-8e9b-4a17-8764-a2f4afd6945c-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 09:59:13 crc kubenswrapper[4954]: I1206 09:59:13.454236 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cecd3145-8e9b-4a17-8764-a2f4afd6945c" path="/var/lib/kubelet/pods/cecd3145-8e9b-4a17-8764-a2f4afd6945c/volumes" Dec 06 09:59:40 crc kubenswrapper[4954]: I1206 09:59:40.101917 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 09:59:40 crc kubenswrapper[4954]: I1206 09:59:40.102607 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.149053 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j"] Dec 06 10:00:00 crc kubenswrapper[4954]: E1206 10:00:00.150074 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cecd3145-8e9b-4a17-8764-a2f4afd6945c" containerName="swift-ring-rebalance" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.150089 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="cecd3145-8e9b-4a17-8764-a2f4afd6945c" containerName="swift-ring-rebalance" Dec 06 10:00:00 crc 
kubenswrapper[4954]: I1206 10:00:00.150376 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="cecd3145-8e9b-4a17-8764-a2f4afd6945c" containerName="swift-ring-rebalance" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.151160 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.153131 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.153822 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.191484 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j"] Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.253011 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.253094 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.253172 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9gn\" (UniqueName: \"kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.354765 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.354843 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.354906 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9gn\" (UniqueName: \"kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.355781 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.367754 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.371758 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9gn\" (UniqueName: \"kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn\") pod \"collect-profiles-29416920-v2n7j\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.474251 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:00 crc kubenswrapper[4954]: I1206 10:00:00.956737 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j"] Dec 06 10:00:01 crc kubenswrapper[4954]: I1206 10:00:01.015057 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" event={"ID":"bda2e2be-14f5-448f-994e-6a7c0e9a43a1","Type":"ContainerStarted","Data":"5e57fe35c62d70b8c1524009d0022fe6c208844772b68ed8525f355494a93914"} Dec 06 10:00:02 crc kubenswrapper[4954]: I1206 10:00:02.025689 4954 generic.go:334] "Generic (PLEG): container finished" podID="bda2e2be-14f5-448f-994e-6a7c0e9a43a1" containerID="bdc87d53b14d2e5ba2258f705b75e6e7ec2c804645b37c8dab7631c7d23994c4" exitCode=0 Dec 06 10:00:02 crc kubenswrapper[4954]: I1206 10:00:02.025784 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" event={"ID":"bda2e2be-14f5-448f-994e-6a7c0e9a43a1","Type":"ContainerDied","Data":"bdc87d53b14d2e5ba2258f705b75e6e7ec2c804645b37c8dab7631c7d23994c4"} Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.708655 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.847448 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9gn\" (UniqueName: \"kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn\") pod \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.847578 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume\") pod \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.847858 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume\") pod \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\" (UID: \"bda2e2be-14f5-448f-994e-6a7c0e9a43a1\") " Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.848707 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume" (OuterVolumeSpecName: "config-volume") pod "bda2e2be-14f5-448f-994e-6a7c0e9a43a1" (UID: "bda2e2be-14f5-448f-994e-6a7c0e9a43a1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.854681 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn" (OuterVolumeSpecName: "kube-api-access-6q9gn") pod "bda2e2be-14f5-448f-994e-6a7c0e9a43a1" (UID: "bda2e2be-14f5-448f-994e-6a7c0e9a43a1"). InnerVolumeSpecName "kube-api-access-6q9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.855708 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bda2e2be-14f5-448f-994e-6a7c0e9a43a1" (UID: "bda2e2be-14f5-448f-994e-6a7c0e9a43a1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.950019 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9gn\" (UniqueName: \"kubernetes.io/projected/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-kube-api-access-6q9gn\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.950366 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:03 crc kubenswrapper[4954]: I1206 10:00:03.950377 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bda2e2be-14f5-448f-994e-6a7c0e9a43a1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:04 crc kubenswrapper[4954]: I1206 10:00:04.047694 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" event={"ID":"bda2e2be-14f5-448f-994e-6a7c0e9a43a1","Type":"ContainerDied","Data":"5e57fe35c62d70b8c1524009d0022fe6c208844772b68ed8525f355494a93914"} Dec 06 10:00:04 crc kubenswrapper[4954]: I1206 10:00:04.047740 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e57fe35c62d70b8c1524009d0022fe6c208844772b68ed8525f355494a93914" Dec 06 10:00:04 crc kubenswrapper[4954]: I1206 10:00:04.047775 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j" Dec 06 10:00:04 crc kubenswrapper[4954]: I1206 10:00:04.801564 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd"] Dec 06 10:00:04 crc kubenswrapper[4954]: I1206 10:00:04.811951 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416875-p2cwd"] Dec 06 10:00:05 crc kubenswrapper[4954]: I1206 10:00:05.458339 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e64b559-f2f9-42d5-9c45-bbdab57efb99" path="/var/lib/kubelet/pods/5e64b559-f2f9-42d5-9c45-bbdab57efb99/volumes" Dec 06 10:00:10 crc kubenswrapper[4954]: I1206 10:00:10.102896 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:00:10 crc kubenswrapper[4954]: I1206 10:00:10.103532 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:00:10 crc kubenswrapper[4954]: I1206 10:00:10.103636 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:00:10 crc kubenswrapper[4954]: I1206 10:00:10.104840 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0"} 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:00:10 crc kubenswrapper[4954]: I1206 10:00:10.104929 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0" gracePeriod=600 Dec 06 10:00:11 crc kubenswrapper[4954]: I1206 10:00:11.123950 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0" exitCode=0 Dec 06 10:00:11 crc kubenswrapper[4954]: I1206 10:00:11.124000 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0"} Dec 06 10:00:11 crc kubenswrapper[4954]: I1206 10:00:11.124392 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda"} Dec 06 10:00:11 crc kubenswrapper[4954]: I1206 10:00:11.124428 4954 scope.go:117] "RemoveContainer" containerID="7a653149255b93444edbd7fc63916789980779af86e85dd15525a62b6f955921" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.461425 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-d4qmh"] Dec 06 10:00:12 crc kubenswrapper[4954]: E1206 10:00:12.462454 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda2e2be-14f5-448f-994e-6a7c0e9a43a1" containerName="collect-profiles" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.462471 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda2e2be-14f5-448f-994e-6a7c0e9a43a1" containerName="collect-profiles" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.462848 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda2e2be-14f5-448f-994e-6a7c0e9a43a1" containerName="collect-profiles" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.463852 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.467007 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.467463 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.484500 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-d4qmh"] Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615084 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615148 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615177 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwj8\" (UniqueName: \"kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615216 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615252 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615303 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.615343 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717303 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717372 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717400 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwj8\" (UniqueName: \"kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717435 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717461 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.717616 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.718318 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.718844 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.719015 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.723805 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.727006 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.730115 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.736105 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwj8\" (UniqueName: \"kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8\") pod \"swift-ring-rebalance-debug-d4qmh\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:12 crc kubenswrapper[4954]: I1206 10:00:12.792506 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:13 crc kubenswrapper[4954]: I1206 10:00:13.291021 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-d4qmh"] Dec 06 10:00:13 crc kubenswrapper[4954]: W1206 10:00:13.302452 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48cb23e4_64a3_4817_b6c0_9e8eb6636f49.slice/crio-e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e WatchSource:0}: Error finding container e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e: Status 404 returned error can't find the container with id e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e Dec 06 10:00:14 crc kubenswrapper[4954]: I1206 10:00:14.163224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-d4qmh" event={"ID":"48cb23e4-64a3-4817-b6c0-9e8eb6636f49","Type":"ContainerStarted","Data":"85d3d67cf2e7c4a8ca306645a3026d3abc7a6a9b2302357b9580eeb9da9a729f"} Dec 06 10:00:14 crc kubenswrapper[4954]: I1206 10:00:14.164671 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-d4qmh" event={"ID":"48cb23e4-64a3-4817-b6c0-9e8eb6636f49","Type":"ContainerStarted","Data":"e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e"} Dec 06 10:00:14 crc kubenswrapper[4954]: I1206 10:00:14.197086 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-d4qmh" podStartSLOduration=2.197065588 podStartE2EDuration="2.197065588s" podCreationTimestamp="2025-12-06 10:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:00:14.182004387 +0000 UTC m=+10988.995363776" watchObservedRunningTime="2025-12-06 10:00:14.197065588 +0000 UTC m=+10989.010424977" Dec 06 10:00:22 crc kubenswrapper[4954]: I1206 10:00:22.248228 4954 generic.go:334] "Generic (PLEG): container finished" podID="48cb23e4-64a3-4817-b6c0-9e8eb6636f49" containerID="85d3d67cf2e7c4a8ca306645a3026d3abc7a6a9b2302357b9580eeb9da9a729f" exitCode=0 Dec 06 10:00:22 crc kubenswrapper[4954]: I1206 10:00:22.248312 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-d4qmh" event={"ID":"48cb23e4-64a3-4817-b6c0-9e8eb6636f49","Type":"ContainerDied","Data":"85d3d67cf2e7c4a8ca306645a3026d3abc7a6a9b2302357b9580eeb9da9a729f"} Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.262359 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.271964 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-d4qmh" event={"ID":"48cb23e4-64a3-4817-b6c0-9e8eb6636f49","Type":"ContainerDied","Data":"e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e"} Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.272015 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7cf5b83abd8c5ab9f281b6b1c48cbde743159f273a4d40698090163399c892e" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.272096 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-d4qmh" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.313248 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-d4qmh"] Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.327365 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-d4qmh"] Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.447615 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.447681 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwj8\" (UniqueName: \"kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.447808 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.447848 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.447940 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.448034 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.448077 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift\") pod \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\" (UID: \"48cb23e4-64a3-4817-b6c0-9e8eb6636f49\") " Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.448351 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.449345 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.449490 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.453420 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8" (OuterVolumeSpecName: "kube-api-access-bqwj8") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "kube-api-access-bqwj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.478404 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.486690 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.491966 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.508450 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts" (OuterVolumeSpecName: "scripts") pod "48cb23e4-64a3-4817-b6c0-9e8eb6636f49" (UID: "48cb23e4-64a3-4817-b6c0-9e8eb6636f49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.552502 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwj8\" (UniqueName: \"kubernetes.io/projected/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-kube-api-access-bqwj8\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.552841 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.552952 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.553182 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.553268 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:24 crc kubenswrapper[4954]: I1206 10:00:24.553348 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48cb23e4-64a3-4817-b6c0-9e8eb6636f49-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:00:25 crc kubenswrapper[4954]: I1206 10:00:25.462653 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cb23e4-64a3-4817-b6c0-9e8eb6636f49" path="/var/lib/kubelet/pods/48cb23e4-64a3-4817-b6c0-9e8eb6636f49/volumes" Dec 06 10:00:31 crc kubenswrapper[4954]: I1206 10:00:31.413776 4954 scope.go:117] "RemoveContainer" containerID="a63ac1296a8ae2835867f922d1c0a55b83dffbea4307f1a1c5e243b18dd445a9" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.162289 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416921-n7phh"] Dec 06 10:01:00 crc kubenswrapper[4954]: E1206 10:01:00.163926 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cb23e4-64a3-4817-b6c0-9e8eb6636f49" containerName="swift-ring-rebalance" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.163955 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cb23e4-64a3-4817-b6c0-9e8eb6636f49" containerName="swift-ring-rebalance" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.164453 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cb23e4-64a3-4817-b6c0-9e8eb6636f49" containerName="swift-ring-rebalance" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.165927 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.185509 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416921-n7phh"] Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.216500 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mgq\" (UniqueName: \"kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.216631 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.216713 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.216879 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.318804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mgq\" (UniqueName: \"kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.318857 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.318895 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.318980 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.324922 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.327688 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.330616 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.334192 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mgq\" (UniqueName: \"kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq\") pod \"keystone-cron-29416921-n7phh\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.485891 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:00 crc kubenswrapper[4954]: I1206 10:01:00.962539 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416921-n7phh"] Dec 06 10:01:01 crc kubenswrapper[4954]: I1206 10:01:01.804110 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-n7phh" event={"ID":"df3840b1-bd73-4b75-9b02-2e48ccc35182","Type":"ContainerStarted","Data":"4ef80a7e0f3d59fda52b71e434d50d3a4cabce0ea66e3bf4bc049c16d55ccb57"} Dec 06 10:01:01 crc kubenswrapper[4954]: I1206 10:01:01.804447 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-n7phh" event={"ID":"df3840b1-bd73-4b75-9b02-2e48ccc35182","Type":"ContainerStarted","Data":"1be8c162808d006ef2b93e6be3a52293ba0804aa81f03b6cd541edbf494ad923"} Dec 06 10:01:01 crc kubenswrapper[4954]: I1206 10:01:01.838478 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416921-n7phh" podStartSLOduration=1.838425275 podStartE2EDuration="1.838425275s" podCreationTimestamp="2025-12-06 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:01.819508501 +0000 UTC m=+11036.632867900" watchObservedRunningTime="2025-12-06 10:01:01.838425275 +0000 UTC m=+11036.651784714" Dec 06 10:01:03 crc kubenswrapper[4954]: I1206 10:01:03.826181 4954 generic.go:334] "Generic (PLEG): container finished" podID="df3840b1-bd73-4b75-9b02-2e48ccc35182" containerID="4ef80a7e0f3d59fda52b71e434d50d3a4cabce0ea66e3bf4bc049c16d55ccb57" exitCode=0 Dec 06 10:01:03 crc kubenswrapper[4954]: I1206 10:01:03.826259 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-n7phh" event={"ID":"df3840b1-bd73-4b75-9b02-2e48ccc35182","Type":"ContainerDied","Data":"4ef80a7e0f3d59fda52b71e434d50d3a4cabce0ea66e3bf4bc049c16d55ccb57"} Dec 06 10:01:05 crc kubenswrapper[4954]: 
I1206 10:01:05.694103 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.741493 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle\") pod \"df3840b1-bd73-4b75-9b02-2e48ccc35182\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.741644 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys\") pod \"df3840b1-bd73-4b75-9b02-2e48ccc35182\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.741739 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data\") pod \"df3840b1-bd73-4b75-9b02-2e48ccc35182\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.741801 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5mgq\" (UniqueName: \"kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq\") pod \"df3840b1-bd73-4b75-9b02-2e48ccc35182\" (UID: \"df3840b1-bd73-4b75-9b02-2e48ccc35182\") " Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.747408 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "df3840b1-bd73-4b75-9b02-2e48ccc35182" (UID: "df3840b1-bd73-4b75-9b02-2e48ccc35182"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.757917 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq" (OuterVolumeSpecName: "kube-api-access-n5mgq") pod "df3840b1-bd73-4b75-9b02-2e48ccc35182" (UID: "df3840b1-bd73-4b75-9b02-2e48ccc35182"). InnerVolumeSpecName "kube-api-access-n5mgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.786993 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df3840b1-bd73-4b75-9b02-2e48ccc35182" (UID: "df3840b1-bd73-4b75-9b02-2e48ccc35182"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.821402 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data" (OuterVolumeSpecName: "config-data") pod "df3840b1-bd73-4b75-9b02-2e48ccc35182" (UID: "df3840b1-bd73-4b75-9b02-2e48ccc35182"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.846993 4954 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.847028 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.847039 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5mgq\" (UniqueName: \"kubernetes.io/projected/df3840b1-bd73-4b75-9b02-2e48ccc35182-kube-api-access-n5mgq\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.847049 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3840b1-bd73-4b75-9b02-2e48ccc35182-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.851805 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416921-n7phh" event={"ID":"df3840b1-bd73-4b75-9b02-2e48ccc35182","Type":"ContainerDied","Data":"1be8c162808d006ef2b93e6be3a52293ba0804aa81f03b6cd541edbf494ad923"} Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.851846 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be8c162808d006ef2b93e6be3a52293ba0804aa81f03b6cd541edbf494ad923" Dec 06 10:01:05 crc kubenswrapper[4954]: I1206 10:01:05.851878 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416921-n7phh" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.444418 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-qlm7v"] Dec 06 10:01:24 crc kubenswrapper[4954]: E1206 10:01:24.445738 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3840b1-bd73-4b75-9b02-2e48ccc35182" containerName="keystone-cron" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.445753 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3840b1-bd73-4b75-9b02-2e48ccc35182" containerName="keystone-cron" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.445963 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3840b1-bd73-4b75-9b02-2e48ccc35182" containerName="keystone-cron" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.446649 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.449355 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.450147 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.467411 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.467801 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.467886 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.468057 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.468097 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.468124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.468390 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4fv\" (UniqueName: \"kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.470052 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-qlm7v"] Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571181 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571290 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571374 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571402 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571429 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571520 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4fv\" (UniqueName: \"kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.571591 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.572157 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.572383 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.572667 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.577553 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.577633 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.603559 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.605211 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4fv\" (UniqueName: \"kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv\") pod \"swift-ring-rebalance-debug-qlm7v\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:24 crc kubenswrapper[4954]: I1206 10:01:24.769860 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:25 crc kubenswrapper[4954]: I1206 10:01:25.322137 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-qlm7v"] Dec 06 10:01:26 crc kubenswrapper[4954]: I1206 10:01:26.203304 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-qlm7v" event={"ID":"d441d22a-4e2f-4c93-9527-2c8a15624435","Type":"ContainerStarted","Data":"4a0fa9af378e124c36b0ef5dcedf96c573d59c556299926d25525b4eac28a312"} Dec 06 10:01:26 crc kubenswrapper[4954]: I1206 10:01:26.203623 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-qlm7v" event={"ID":"d441d22a-4e2f-4c93-9527-2c8a15624435","Type":"ContainerStarted","Data":"7a726fb97102fe8f6bc7bba7ab8c7e1c1390992b2c29da528877888c42c1059a"} Dec 06 10:01:26 crc kubenswrapper[4954]: I1206 10:01:26.234651 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-qlm7v" podStartSLOduration=2.234634362 podStartE2EDuration="2.234634362s" podCreationTimestamp="2025-12-06 10:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:01:26.219843718 +0000 UTC m=+11061.033203107" watchObservedRunningTime="2025-12-06 10:01:26.234634362 +0000 UTC m=+11061.047993751" Dec 06 10:01:31 crc kubenswrapper[4954]: I1206 10:01:31.492926 4954 scope.go:117] "RemoveContainer" containerID="13ad78a44130d09cfd95ce82324edb40edcb03e12235ed82a2a99244bd48a869" Dec 06 10:01:35 crc kubenswrapper[4954]: I1206 10:01:35.290705 4954 generic.go:334] "Generic (PLEG): container finished" podID="d441d22a-4e2f-4c93-9527-2c8a15624435" containerID="4a0fa9af378e124c36b0ef5dcedf96c573d59c556299926d25525b4eac28a312" exitCode=0 Dec 06 10:01:35 crc kubenswrapper[4954]: I1206 10:01:35.290753 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-qlm7v" event={"ID":"d441d22a-4e2f-4c93-9527-2c8a15624435","Type":"ContainerDied","Data":"4a0fa9af378e124c36b0ef5dcedf96c573d59c556299926d25525b4eac28a312"} Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.491003 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.531941 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-qlm7v"] Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.542712 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-qlm7v"] Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545387 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g4fv\" (UniqueName: \"kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545552 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545672 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545746 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545773 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545831 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.545857 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift\") pod \"d441d22a-4e2f-4c93-9527-2c8a15624435\" (UID: \"d441d22a-4e2f-4c93-9527-2c8a15624435\") " Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.551017 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.551158 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv" (OuterVolumeSpecName: "kube-api-access-4g4fv") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "kube-api-access-4g4fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.551715 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.578856 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.586196 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.593694 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts" (OuterVolumeSpecName: "scripts") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.597592 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d441d22a-4e2f-4c93-9527-2c8a15624435" (UID: "d441d22a-4e2f-4c93-9527-2c8a15624435"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648202 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d441d22a-4e2f-4c93-9527-2c8a15624435-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648251 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g4fv\" (UniqueName: \"kubernetes.io/projected/d441d22a-4e2f-4c93-9527-2c8a15624435-kube-api-access-4g4fv\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648266 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648278 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d441d22a-4e2f-4c93-9527-2c8a15624435-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648289 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648298 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:37 crc kubenswrapper[4954]: I1206 10:01:37.648307 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d441d22a-4e2f-4c93-9527-2c8a15624435-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:01:38 crc kubenswrapper[4954]: I1206 10:01:38.322840 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a726fb97102fe8f6bc7bba7ab8c7e1c1390992b2c29da528877888c42c1059a" Dec 06 10:01:38 crc kubenswrapper[4954]: I1206 10:01:38.322927 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-qlm7v" Dec 06 10:01:39 crc kubenswrapper[4954]: I1206 10:01:39.459427 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d441d22a-4e2f-4c93-9527-2c8a15624435" path="/var/lib/kubelet/pods/d441d22a-4e2f-4c93-9527-2c8a15624435/volumes" Dec 06 10:02:10 crc kubenswrapper[4954]: I1206 10:02:10.101807 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:02:10 crc kubenswrapper[4954]: I1206 10:02:10.102307 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.634944 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:33 crc kubenswrapper[4954]: E1206 10:02:33.635881 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d441d22a-4e2f-4c93-9527-2c8a15624435" containerName="swift-ring-rebalance" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.635893 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d441d22a-4e2f-4c93-9527-2c8a15624435" containerName="swift-ring-rebalance" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.636111 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d441d22a-4e2f-4c93-9527-2c8a15624435" containerName="swift-ring-rebalance" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.637646 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.649002 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.666612 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.666658 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.666722 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jl4r\" (UniqueName: \"kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.768577 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jl4r\" (UniqueName: \"kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.768743 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.768771 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.769463 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.769636 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.818484 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7jl4r\" (UniqueName: \"kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r\") pod \"community-operators-69rkh\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:33 crc kubenswrapper[4954]: I1206 10:02:33.959438 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:34 crc kubenswrapper[4954]: I1206 10:02:34.551711 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:34 crc kubenswrapper[4954]: I1206 10:02:34.958532 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerID="db8b27535ca7959c24eefc4bae860da73fd6ebcb73bcccb8d21ceb921562195f" exitCode=0 Dec 06 10:02:34 crc kubenswrapper[4954]: I1206 10:02:34.958632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerDied","Data":"db8b27535ca7959c24eefc4bae860da73fd6ebcb73bcccb8d21ceb921562195f"} Dec 06 10:02:34 crc kubenswrapper[4954]: I1206 10:02:34.959344 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerStarted","Data":"e6e7864e1a835058581ff327d091ca3edda707497f652ccf3969cb96c19b0a64"} Dec 06 10:02:34 crc kubenswrapper[4954]: I1206 10:02:34.960951 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.731265 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-kd24t"] Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.733589 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.735741 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.737314 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.744881 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-kd24t"] Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.864934 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865352 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hf2l\" (UniqueName: \"kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865445 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865590 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865729 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.865871 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.967634 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.967745 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.967830 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.967880 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.967917 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.968015 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.968070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hf2l\" (UniqueName: \"kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.968443 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.968996 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.969378 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.972789 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.973163 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.973599 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:37 crc kubenswrapper[4954]: I1206 10:02:37.988376 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hf2l\" (UniqueName: \"kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l\") pod \"swift-ring-rebalance-debug-kd24t\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:38 crc kubenswrapper[4954]: I1206 10:02:38.063946 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:38 crc kubenswrapper[4954]: I1206 10:02:38.601035 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-kd24t"] Dec 06 10:02:38 crc kubenswrapper[4954]: W1206 10:02:38.602097 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272fe111_b642_4df0_b30b_291f8a7c6ba3.slice/crio-c5438c460a3d178b686aec6458526c63ff3fcf4c91d3430f0124db6e2e2c2240 WatchSource:0}: Error finding container c5438c460a3d178b686aec6458526c63ff3fcf4c91d3430f0124db6e2e2c2240: Status 404 returned error can't find the container with id c5438c460a3d178b686aec6458526c63ff3fcf4c91d3430f0124db6e2e2c2240 Dec 06 10:02:38 crc kubenswrapper[4954]: I1206 10:02:38.998426 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-kd24t" event={"ID":"272fe111-b642-4df0-b30b-291f8a7c6ba3","Type":"ContainerStarted","Data":"c5438c460a3d178b686aec6458526c63ff3fcf4c91d3430f0124db6e2e2c2240"} Dec 06 10:02:39 crc kubenswrapper[4954]: I1206 10:02:39.001888 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerStarted","Data":"f0a29e7dc0aa94c9496ed9e4b5dc6b7674b1f165ddb395e93fce4ef2ad45cf2e"} Dec 06 10:02:40 crc kubenswrapper[4954]: I1206 10:02:40.101235 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:02:40 crc kubenswrapper[4954]: I1206 10:02:40.101585 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:02:41 crc kubenswrapper[4954]: I1206 10:02:41.045374 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerID="f0a29e7dc0aa94c9496ed9e4b5dc6b7674b1f165ddb395e93fce4ef2ad45cf2e" exitCode=0 Dec 06 10:02:41 crc kubenswrapper[4954]: I1206 10:02:41.046188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerDied","Data":"f0a29e7dc0aa94c9496ed9e4b5dc6b7674b1f165ddb395e93fce4ef2ad45cf2e"} Dec 06 10:02:41 crc kubenswrapper[4954]: I1206 10:02:41.050188 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-kd24t" event={"ID":"272fe111-b642-4df0-b30b-291f8a7c6ba3","Type":"ContainerStarted","Data":"9e8e1d32cc876a2a039eb292b37621c139e72b2ae14a3cbe8a49b57e7ba43ef0"} Dec 06 10:02:41 crc kubenswrapper[4954]: I1206 10:02:41.100622 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-kd24t" podStartSLOduration=4.100603411 podStartE2EDuration="4.100603411s" podCreationTimestamp="2025-12-06 10:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:02:41.079208521 +0000 UTC 
m=+11135.892567950" watchObservedRunningTime="2025-12-06 10:02:41.100603411 +0000 UTC m=+11135.913962790" Dec 06 10:02:42 crc kubenswrapper[4954]: I1206 10:02:42.065180 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerStarted","Data":"8fe8a6e7a10522504d6edfe5b7a5cebdcb9ecbfddb52a69e3a6b7e2fd5977c26"} Dec 06 10:02:42 crc kubenswrapper[4954]: I1206 10:02:42.092704 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69rkh" podStartSLOduration=2.554201975 podStartE2EDuration="9.092683411s" podCreationTimestamp="2025-12-06 10:02:33 +0000 UTC" firstStartedPulling="2025-12-06 10:02:34.960554649 +0000 UTC m=+11129.773914048" lastFinishedPulling="2025-12-06 10:02:41.499036095 +0000 UTC m=+11136.312395484" observedRunningTime="2025-12-06 10:02:42.087075842 +0000 UTC m=+11136.900435231" watchObservedRunningTime="2025-12-06 10:02:42.092683411 +0000 UTC m=+11136.906042800" Dec 06 10:02:43 crc kubenswrapper[4954]: I1206 10:02:43.959703 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:43 crc kubenswrapper[4954]: I1206 10:02:43.960850 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:45 crc kubenswrapper[4954]: I1206 10:02:45.007732 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-69rkh" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="registry-server" probeResult="failure" output=< Dec 06 10:02:45 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:02:45 crc kubenswrapper[4954]: > Dec 06 10:02:50 crc kubenswrapper[4954]: I1206 10:02:50.153790 4954 generic.go:334] "Generic (PLEG): container finished" podID="272fe111-b642-4df0-b30b-291f8a7c6ba3" containerID="9e8e1d32cc876a2a039eb292b37621c139e72b2ae14a3cbe8a49b57e7ba43ef0" exitCode=0 Dec 06 10:02:50 crc kubenswrapper[4954]: I1206 10:02:50.153875 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-kd24t" event={"ID":"272fe111-b642-4df0-b30b-291f8a7c6ba3","Type":"ContainerDied","Data":"9e8e1d32cc876a2a039eb292b37621c139e72b2ae14a3cbe8a49b57e7ba43ef0"} Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.444688 4954 util.go:48] "No ready sandbox for pod can be found. 
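[Annotation] The pod_startup_latency_tracker entries above encode a simple relationship: podStartSLOduration is podStartE2EDuration minus the time spent pulling images. Pods that pulled nothing report the zero timestamp 0001-01-01 for both pull markers and the two durations coincide, as with swift-ring-rebalance-debug-kd24t. For community-operators-69rkh the subtraction reproduces the logged value exactly when done on the monotonic-clock offsets (the m=+... values); a worked check with numbers copied from the log:

```python
# Monotonic offsets (seconds) from the community-operators-69rkh entry:
first_started_pulling = 11129.773914048   # firstStartedPulling m=+...
last_finished_pulling = 11136.312395484   # lastFinishedPulling m=+...
pod_start_e2e = 9.092683411               # podStartE2EDuration

pull_time = last_finished_pulling - first_started_pulling
slo = pod_start_e2e - pull_time
print(f"image pull took      {pull_time:.9f}s")  # 6.538481436s
print(f"podStartSLOduration = {slo:.9f}s")       # 2.554201975s, as logged
```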
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.536367 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-kd24t"] Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.547598 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-kd24t"] Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.567874 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.567928 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.568046 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.568096 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.568140 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hf2l\" (UniqueName: \"kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.568172 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.568188 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices\") pod \"272fe111-b642-4df0-b30b-291f8a7c6ba3\" (UID: \"272fe111-b642-4df0-b30b-291f8a7c6ba3\") " Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.570261 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.571400 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.573906 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l" (OuterVolumeSpecName: "kube-api-access-7hf2l") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "kube-api-access-7hf2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.595144 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts" (OuterVolumeSpecName: "scripts") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.602376 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.602910 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.605379 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "272fe111-b642-4df0-b30b-291f8a7c6ba3" (UID: "272fe111-b642-4df0-b30b-291f8a7c6ba3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670107 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670147 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670160 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/272fe111-b642-4df0-b30b-291f8a7c6ba3-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670174 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670188 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hf2l\" (UniqueName: \"kubernetes.io/projected/272fe111-b642-4df0-b30b-291f8a7c6ba3-kube-api-access-7hf2l\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670203 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/272fe111-b642-4df0-b30b-291f8a7c6ba3-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:52 crc kubenswrapper[4954]: I1206 10:02:52.670217 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/272fe111-b642-4df0-b30b-291f8a7c6ba3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:53 crc kubenswrapper[4954]: I1206 10:02:53.186443 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5438c460a3d178b686aec6458526c63ff3fcf4c91d3430f0124db6e2e2c2240" Dec 06 10:02:53 crc kubenswrapper[4954]: I1206 10:02:53.186527 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-kd24t" Dec 06 10:02:53 crc kubenswrapper[4954]: I1206 10:02:53.456943 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272fe111-b642-4df0-b30b-291f8a7c6ba3" path="/var/lib/kubelet/pods/272fe111-b642-4df0-b30b-291f8a7c6ba3/volumes" Dec 06 10:02:54 crc kubenswrapper[4954]: I1206 10:02:54.026702 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:54 crc kubenswrapper[4954]: I1206 10:02:54.084474 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:54 crc kubenswrapper[4954]: I1206 10:02:54.266110 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:55 crc kubenswrapper[4954]: I1206 10:02:55.206301 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69rkh" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="registry-server" containerID="cri-o://8fe8a6e7a10522504d6edfe5b7a5cebdcb9ecbfddb52a69e3a6b7e2fd5977c26" gracePeriod=2 Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.216637 4954 generic.go:334] "Generic (PLEG): container finished" podID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerID="8fe8a6e7a10522504d6edfe5b7a5cebdcb9ecbfddb52a69e3a6b7e2fd5977c26" exitCode=0 Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.216728 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerDied","Data":"8fe8a6e7a10522504d6edfe5b7a5cebdcb9ecbfddb52a69e3a6b7e2fd5977c26"} Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.216951 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69rkh" event={"ID":"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0","Type":"ContainerDied","Data":"e6e7864e1a835058581ff327d091ca3edda707497f652ccf3969cb96c19b0a64"} Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.216966 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e7864e1a835058581ff327d091ca3edda707497f652ccf3969cb96c19b0a64" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.270668 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.361802 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities\") pod \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.362356 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jl4r\" (UniqueName: \"kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r\") pod \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.362556 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content\") pod \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\" (UID: \"4a4d5486-4b94-4f76-9eaa-adcffb9c81c0\") " Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.362772 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities" (OuterVolumeSpecName: "utilities") pod "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" (UID: "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.363323 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.370586 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r" (OuterVolumeSpecName: "kube-api-access-7jl4r") pod "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" (UID: "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0"). InnerVolumeSpecName "kube-api-access-7jl4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.412958 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" (UID: "4a4d5486-4b94-4f76-9eaa-adcffb9c81c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.465020 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:56 crc kubenswrapper[4954]: I1206 10:02:56.465259 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jl4r\" (UniqueName: \"kubernetes.io/projected/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0-kube-api-access-7jl4r\") on node \"crc\" DevicePath \"\"" Dec 06 10:02:57 crc kubenswrapper[4954]: I1206 10:02:57.231235 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69rkh" Dec 06 10:02:57 crc kubenswrapper[4954]: I1206 10:02:57.271618 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:57 crc kubenswrapper[4954]: I1206 10:02:57.282471 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69rkh"] Dec 06 10:02:57 crc kubenswrapper[4954]: I1206 10:02:57.454624 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" path="/var/lib/kubelet/pods/4a4d5486-4b94-4f76-9eaa-adcffb9c81c0/volumes" Dec 06 10:03:10 crc kubenswrapper[4954]: I1206 10:03:10.101049 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:03:10 crc kubenswrapper[4954]: I1206 10:03:10.101486 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:03:10 crc kubenswrapper[4954]: I1206 10:03:10.101530 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:03:10 crc kubenswrapper[4954]: I1206 10:03:10.102420 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:03:10 crc kubenswrapper[4954]: I1206 10:03:10.102490 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" gracePeriod=600 Dec 06 10:03:12 crc kubenswrapper[4954]: I1206 10:03:12.395093 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" exitCode=0 Dec 06 10:03:12 crc kubenswrapper[4954]: I1206 10:03:12.395723 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda"} Dec 06 10:03:12 crc kubenswrapper[4954]: I1206 10:03:12.395766 4954 scope.go:117] "RemoveContainer" containerID="71099a7502436019cc48a8c0425ae64d0598a10ec6e6d6d7c4657bec700876e0" Dec 06 10:03:13 crc kubenswrapper[4954]: E1206 10:03:13.127378 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:03:13 crc kubenswrapper[4954]: I1206 10:03:13.408744 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:03:13 crc kubenswrapper[4954]: E1206 10:03:13.409411 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:03:28 crc kubenswrapper[4954]: I1206 10:03:28.444991 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:03:28 crc kubenswrapper[4954]: E1206 10:03:28.445904 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:03:31 crc kubenswrapper[4954]: I1206 10:03:31.594839 4954 scope.go:117] "RemoveContainer" containerID="b83a58551c2fc4d66eac24c96715a9875d9e86a16142d90e2cad9833aeb237e0" Dec 06 10:03:44 crc kubenswrapper[4954]: I1206 10:03:44.443786 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:03:44 crc kubenswrapper[4954]: E1206 10:03:44.444657 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.651503 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-frkxd"] Dec 06 10:03:52 crc kubenswrapper[4954]: E1206 10:03:52.652449 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="extract-utilities" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652464 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="extract-utilities" Dec 06 10:03:52 crc kubenswrapper[4954]: E1206 10:03:52.652491 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="extract-content" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652497 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="extract-content" Dec 06 10:03:52 crc kubenswrapper[4954]: E1206 10:03:52.652533 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" 
containerName="registry-server" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652539 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="registry-server" Dec 06 10:03:52 crc kubenswrapper[4954]: E1206 10:03:52.652547 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272fe111-b642-4df0-b30b-291f8a7c6ba3" containerName="swift-ring-rebalance" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652554 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="272fe111-b642-4df0-b30b-291f8a7c6ba3" containerName="swift-ring-rebalance" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652788 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="272fe111-b642-4df0-b30b-291f8a7c6ba3" containerName="swift-ring-rebalance" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.652802 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4d5486-4b94-4f76-9eaa-adcffb9c81c0" containerName="registry-server" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.653553 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.655659 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.656615 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.667975 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-frkxd"] Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832471 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832513 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832568 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832626 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832647 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832688 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.832783 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b64cl\" (UniqueName: \"kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934618 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b64cl\" (UniqueName: \"kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934725 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934753 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934781 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934865 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934894 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.934947 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.935723 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.935808 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.936085 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.941029 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.941291 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.941337 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.952348 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b64cl\" (UniqueName: \"kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl\") pod \"swift-ring-rebalance-debug-frkxd\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:52 crc kubenswrapper[4954]: I1206 10:03:52.978727 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:03:53 crc kubenswrapper[4954]: I1206 10:03:53.697691 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-frkxd"] Dec 06 10:03:53 crc kubenswrapper[4954]: I1206 10:03:53.926446 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frkxd" event={"ID":"9ce1f5c7-c146-420b-9a08-e104387f643e","Type":"ContainerStarted","Data":"6b08411c84c4f4dcf92df19ddbc11b4a42b08d0c94a9c0b1e4caeaf118c5429b"} Dec 06 10:03:54 crc kubenswrapper[4954]: I1206 10:03:54.943481 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frkxd" event={"ID":"9ce1f5c7-c146-420b-9a08-e104387f643e","Type":"ContainerStarted","Data":"f79a54180969298c63100a693c76b6af39709fb2166750bf0dbf68529c47dfd9"} Dec 06 10:03:54 crc kubenswrapper[4954]: I1206 10:03:54.966373 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-frkxd" podStartSLOduration=2.96635062 podStartE2EDuration="2.96635062s" podCreationTimestamp="2025-12-06 10:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:03:54.957294529 +0000 UTC m=+11209.770653928" watchObservedRunningTime="2025-12-06 10:03:54.96635062 +0000 UTC m=+11209.779710009" Dec 06 10:03:58 crc kubenswrapper[4954]: I1206 10:03:58.444118 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:03:58 crc kubenswrapper[4954]: E1206 10:03:58.444900 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:04:04 crc kubenswrapper[4954]: I1206 10:04:04.034268 4954 generic.go:334] "Generic (PLEG): container finished" podID="9ce1f5c7-c146-420b-9a08-e104387f643e" containerID="f79a54180969298c63100a693c76b6af39709fb2166750bf0dbf68529c47dfd9" exitCode=0 Dec 06 10:04:04 crc kubenswrapper[4954]: I1206 10:04:04.034374 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frkxd" event={"ID":"9ce1f5c7-c146-420b-9a08-e104387f643e","Type":"ContainerDied","Data":"f79a54180969298c63100a693c76b6af39709fb2166750bf0dbf68529c47dfd9"} Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.501466 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.544541 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-frkxd"] Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.554745 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-frkxd"] Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.627855 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.627983 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628026 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628063 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628115 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628135 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b64cl\" (UniqueName: \"kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628177 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts\") pod \"9ce1f5c7-c146-420b-9a08-e104387f643e\" (UID: \"9ce1f5c7-c146-420b-9a08-e104387f643e\") " Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.628538 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.629065 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.629544 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.634826 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl" (OuterVolumeSpecName: "kube-api-access-b64cl") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "kube-api-access-b64cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.657663 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts" (OuterVolumeSpecName: "scripts") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.660388 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.661645 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.662100 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9ce1f5c7-c146-420b-9a08-e104387f643e" (UID: "9ce1f5c7-c146-420b-9a08-e104387f643e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731200 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731248 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce1f5c7-c146-420b-9a08-e104387f643e-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731259 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731273 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce1f5c7-c146-420b-9a08-e104387f643e-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731294 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b64cl\" (UniqueName: \"kubernetes.io/projected/9ce1f5c7-c146-420b-9a08-e104387f643e-kube-api-access-b64cl\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:06 crc kubenswrapper[4954]: I1206 10:04:06.731309 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce1f5c7-c146-420b-9a08-e104387f643e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:04:07 crc kubenswrapper[4954]: I1206 10:04:07.065936 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b08411c84c4f4dcf92df19ddbc11b4a42b08d0c94a9c0b1e4caeaf118c5429b" Dec 06 10:04:07 crc kubenswrapper[4954]: I1206 10:04:07.065998 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frkxd" Dec 06 10:04:07 crc kubenswrapper[4954]: I1206 10:04:07.459586 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce1f5c7-c146-420b-9a08-e104387f643e" path="/var/lib/kubelet/pods/9ce1f5c7-c146-420b-9a08-e104387f643e/volumes" Dec 06 10:04:13 crc kubenswrapper[4954]: I1206 10:04:13.444002 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:04:13 crc kubenswrapper[4954]: E1206 10:04:13.444699 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:04:26 crc kubenswrapper[4954]: I1206 10:04:26.443745 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:04:26 crc kubenswrapper[4954]: E1206 10:04:26.444620 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:04:31 crc kubenswrapper[4954]: I1206 10:04:31.681600 4954 scope.go:117] "RemoveContainer" containerID="c87e953c34aab8e8d5c12c6e1164cb4b2de32ca87ac8f996996f046b5e205d09" Dec 06 10:04:37 crc kubenswrapper[4954]: I1206 10:04:37.443402 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:04:37 crc kubenswrapper[4954]: E1206 10:04:37.444096 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:04:50 crc kubenswrapper[4954]: I1206 10:04:50.443672 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:04:50 crc kubenswrapper[4954]: E1206 10:04:50.444466 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.537833 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:04:53 crc kubenswrapper[4954]: E1206 10:04:53.538866 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce1f5c7-c146-420b-9a08-e104387f643e" 
containerName="swift-ring-rebalance" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.538907 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce1f5c7-c146-420b-9a08-e104387f643e" containerName="swift-ring-rebalance" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.539244 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce1f5c7-c146-420b-9a08-e104387f643e" containerName="swift-ring-rebalance" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.540949 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.560652 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.686178 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.686438 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2s4d\" (UniqueName: \"kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.686537 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.788922 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.789064 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2s4d\" (UniqueName: \"kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.789094 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.789606 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " 
pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.789679 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.810766 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2s4d\" (UniqueName: \"kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d\") pod \"redhat-operators-8jb5j\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:53 crc kubenswrapper[4954]: I1206 10:04:53.876317 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:04:54 crc kubenswrapper[4954]: I1206 10:04:54.632009 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:04:55 crc kubenswrapper[4954]: I1206 10:04:55.588263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerStarted","Data":"c19e82613793caead7fd54bfb7863ae2bdb7da6b3063189581a41be40c43c214"} Dec 06 10:04:57 crc kubenswrapper[4954]: I1206 10:04:57.617128 4954 generic.go:334] "Generic (PLEG): container finished" podID="637d562d-110f-46c4-90c7-425e929bd162" containerID="14ff26e962bcdd5694542271e8d5b051d45447f6fec0238b6a69faaffa143e8f" exitCode=0 Dec 06 10:04:57 crc kubenswrapper[4954]: I1206 10:04:57.617198 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerDied","Data":"14ff26e962bcdd5694542271e8d5b051d45447f6fec0238b6a69faaffa143e8f"} Dec 06 10:04:59 crc kubenswrapper[4954]: I1206 10:04:59.644639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerStarted","Data":"fac1f9c81eb2660b98f3527a012dcb55ce7febae208aabcf0760ffac0c10cb17"} Dec 06 10:05:01 crc kubenswrapper[4954]: I1206 10:05:01.443152 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:05:01 crc kubenswrapper[4954]: E1206 10:05:01.443503 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:05:01 crc kubenswrapper[4954]: I1206 10:05:01.962670 4954 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4x2sh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:05:01 crc kubenswrapper[4954]: I1206 10:05:01.962759 4954 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" podUID="e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.733083 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-4qfbm"] Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.735481 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.738622 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.738792 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.748334 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4qfbm"] Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.758671 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759296 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d792\" (UniqueName: \"kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759455 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759509 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759594 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759657 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts\") pod 
\"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.759760 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.861528 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.861986 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862166 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862269 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862424 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862607 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862782 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.862950 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d792\" (UniqueName: \"kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792\") pod 
\"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.863539 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.863969 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.869125 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.869190 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.876450 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:06 crc kubenswrapper[4954]: I1206 10:05:06.880178 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d792\" (UniqueName: \"kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792\") pod \"swift-ring-rebalance-debug-4qfbm\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:07 crc kubenswrapper[4954]: I1206 10:05:07.064626 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:09 crc kubenswrapper[4954]: I1206 10:05:09.593711 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4qfbm"] Dec 06 10:05:09 crc kubenswrapper[4954]: I1206 10:05:09.754728 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4qfbm" event={"ID":"8b504388-7698-4909-90c0-039b4dc89774","Type":"ContainerStarted","Data":"2d449f1f92a0705ab223f087a351a43119a0e08c841d212dd5a0d638b198796b"} Dec 06 10:05:09 crc kubenswrapper[4954]: I1206 10:05:09.757295 4954 generic.go:334] "Generic (PLEG): container finished" podID="637d562d-110f-46c4-90c7-425e929bd162" containerID="fac1f9c81eb2660b98f3527a012dcb55ce7febae208aabcf0760ffac0c10cb17" exitCode=0 Dec 06 10:05:09 crc kubenswrapper[4954]: I1206 10:05:09.757338 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerDied","Data":"fac1f9c81eb2660b98f3527a012dcb55ce7febae208aabcf0760ffac0c10cb17"} Dec 06 10:05:10 crc kubenswrapper[4954]: I1206 10:05:10.772328 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerStarted","Data":"885e8c8f4bd85874bf8917a28cdf4ab8ffa7e48fe025031e76395bdd48e6d14c"} Dec 06 10:05:10 crc kubenswrapper[4954]: I1206 10:05:10.775480 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4qfbm" event={"ID":"8b504388-7698-4909-90c0-039b4dc89774","Type":"ContainerStarted","Data":"80a74c25dc861f1846f0f61a69e3f4e06dc809ec4686a2a724e5cac68f3d206b"} Dec 06 10:05:10 crc kubenswrapper[4954]: I1206 10:05:10.811343 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8jb5j" podStartSLOduration=5.259355604 podStartE2EDuration="17.81129197s" podCreationTimestamp="2025-12-06 10:04:53 +0000 UTC" firstStartedPulling="2025-12-06 10:04:57.620084833 +0000 UTC m=+11272.433444222" lastFinishedPulling="2025-12-06 10:05:10.172021199 +0000 UTC m=+11284.985380588" observedRunningTime="2025-12-06 10:05:10.789204852 +0000 UTC m=+11285.602564251" watchObservedRunningTime="2025-12-06 10:05:10.81129197 +0000 UTC m=+11285.624651369" Dec 06 10:05:10 crc kubenswrapper[4954]: I1206 10:05:10.828223 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-4qfbm" podStartSLOduration=4.828201431 podStartE2EDuration="4.828201431s" podCreationTimestamp="2025-12-06 10:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:05:10.8067721 +0000 UTC m=+11285.620131489" watchObservedRunningTime="2025-12-06 10:05:10.828201431 +0000 UTC m=+11285.641560820" Dec 06 10:05:12 crc kubenswrapper[4954]: I1206 10:05:12.442909 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:05:12 crc kubenswrapper[4954]: E1206 10:05:12.443424 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:05:13 crc kubenswrapper[4954]: I1206 10:05:13.876751 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:13 crc kubenswrapper[4954]: I1206 10:05:13.877065 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:14 crc kubenswrapper[4954]: I1206 10:05:14.932753 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8jb5j" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="registry-server" probeResult="failure" output=< Dec 06 10:05:14 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:05:14 crc kubenswrapper[4954]: > Dec 06 10:05:19 crc kubenswrapper[4954]: I1206 10:05:19.877198 4954 generic.go:334] "Generic (PLEG): container finished" podID="8b504388-7698-4909-90c0-039b4dc89774" containerID="80a74c25dc861f1846f0f61a69e3f4e06dc809ec4686a2a724e5cac68f3d206b" exitCode=0 Dec 06 10:05:19 crc kubenswrapper[4954]: I1206 10:05:19.877292 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4qfbm" event={"ID":"8b504388-7698-4909-90c0-039b4dc89774","Type":"ContainerDied","Data":"80a74c25dc861f1846f0f61a69e3f4e06dc809ec4686a2a724e5cac68f3d206b"} Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.206958 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.247851 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-4qfbm"] Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.257491 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-4qfbm"] Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.369888 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d792\" (UniqueName: \"kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.369941 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.369997 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.370033 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.370082 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.370287 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.370321 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices\") pod \"8b504388-7698-4909-90c0-039b4dc89774\" (UID: \"8b504388-7698-4909-90c0-039b4dc89774\") " Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.371139 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.371167 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.395939 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792" (OuterVolumeSpecName: "kube-api-access-2d792") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "kube-api-access-2d792". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.409001 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.411278 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts" (OuterVolumeSpecName: "scripts") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.413706 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.441385 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8b504388-7698-4909-90c0-039b4dc89774" (UID: "8b504388-7698-4909-90c0-039b4dc89774"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472674 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472703 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b504388-7698-4909-90c0-039b4dc89774-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472713 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b504388-7698-4909-90c0-039b4dc89774-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472723 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d792\" (UniqueName: \"kubernetes.io/projected/8b504388-7698-4909-90c0-039b4dc89774-kube-api-access-2d792\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472733 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472742 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.472751 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b504388-7698-4909-90c0-039b4dc89774-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.713500 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-bhhv9"] Dec 06 10:05:22 crc kubenswrapper[4954]: E1206 10:05:22.714173 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b504388-7698-4909-90c0-039b4dc89774" containerName="swift-ring-rebalance" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.714258 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b504388-7698-4909-90c0-039b4dc89774" containerName="swift-ring-rebalance" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.714555 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b504388-7698-4909-90c0-039b4dc89774" containerName="swift-ring-rebalance" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.715329 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.724601 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-bhhv9"] Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.881027 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.881923 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.882264 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.882313 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwh4w\" (UniqueName: \"kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.882345 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.882365 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.882502 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.911097 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d449f1f92a0705ab223f087a351a43119a0e08c841d212dd5a0d638b198796b" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.911196 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4qfbm" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984002 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984195 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984218 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwh4w\" (UniqueName: \"kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984237 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984255 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984298 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.984437 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.985126 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " 
pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.985347 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.989018 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.989044 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:22 crc kubenswrapper[4954]: I1206 10:05:22.989166 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.000799 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwh4w\" (UniqueName: \"kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w\") pod \"swift-ring-rebalance-debug-bhhv9\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.047338 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.456991 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b504388-7698-4909-90c0-039b4dc89774" path="/var/lib/kubelet/pods/8b504388-7698-4909-90c0-039b4dc89774/volumes" Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.642905 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-bhhv9"] Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.925226 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bhhv9" event={"ID":"b081c7f9-a3db-4ca2-8837-aa51d639c048","Type":"ContainerStarted","Data":"f53d739daa1e60d1597407739683978b62a9fcc5948d3523c6ea84f9d9677663"} Dec 06 10:05:23 crc kubenswrapper[4954]: I1206 10:05:23.947243 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:24 crc kubenswrapper[4954]: I1206 10:05:24.004055 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:24 crc kubenswrapper[4954]: I1206 10:05:24.741771 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:05:24 crc kubenswrapper[4954]: I1206 10:05:24.942370 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bhhv9" event={"ID":"b081c7f9-a3db-4ca2-8837-aa51d639c048","Type":"ContainerStarted","Data":"130ea2278f6920a0111a57c3d384d4e86caf46e68392b9dfde42690c1a03c091"} Dec 06 10:05:24 crc kubenswrapper[4954]: I1206 10:05:24.973059 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-bhhv9" podStartSLOduration=2.9730364419999997 podStartE2EDuration="2.973036442s" podCreationTimestamp="2025-12-06 10:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:05:24.961615588 +0000 UTC m=+11299.774974987" watchObservedRunningTime="2025-12-06 10:05:24.973036442 +0000 UTC m=+11299.786395831" Dec 06 10:05:25 crc kubenswrapper[4954]: I1206 10:05:25.969580 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8jb5j" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="registry-server" containerID="cri-o://885e8c8f4bd85874bf8917a28cdf4ab8ffa7e48fe025031e76395bdd48e6d14c" gracePeriod=2 Dec 06 10:05:26 crc kubenswrapper[4954]: I1206 10:05:26.443285 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:05:26 crc kubenswrapper[4954]: E1206 10:05:26.443640 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:05:26 crc kubenswrapper[4954]: I1206 10:05:26.979894 4954 generic.go:334] "Generic (PLEG): container finished" podID="637d562d-110f-46c4-90c7-425e929bd162" containerID="885e8c8f4bd85874bf8917a28cdf4ab8ffa7e48fe025031e76395bdd48e6d14c" exitCode=0 
Dec 06 10:05:26 crc kubenswrapper[4954]: I1206 10:05:26.979957 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerDied","Data":"885e8c8f4bd85874bf8917a28cdf4ab8ffa7e48fe025031e76395bdd48e6d14c"} Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.099472 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.159282 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content\") pod \"637d562d-110f-46c4-90c7-425e929bd162\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.159402 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2s4d\" (UniqueName: \"kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d\") pod \"637d562d-110f-46c4-90c7-425e929bd162\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.159660 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities\") pod \"637d562d-110f-46c4-90c7-425e929bd162\" (UID: \"637d562d-110f-46c4-90c7-425e929bd162\") " Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.160816 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities" (OuterVolumeSpecName: "utilities") pod "637d562d-110f-46c4-90c7-425e929bd162" (UID: "637d562d-110f-46c4-90c7-425e929bd162"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.165787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d" (OuterVolumeSpecName: "kube-api-access-l2s4d") pod "637d562d-110f-46c4-90c7-425e929bd162" (UID: "637d562d-110f-46c4-90c7-425e929bd162"). InnerVolumeSpecName "kube-api-access-l2s4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.262426 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.262473 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2s4d\" (UniqueName: \"kubernetes.io/projected/637d562d-110f-46c4-90c7-425e929bd162-kube-api-access-l2s4d\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.296371 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "637d562d-110f-46c4-90c7-425e929bd162" (UID: "637d562d-110f-46c4-90c7-425e929bd162"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.364053 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d562d-110f-46c4-90c7-425e929bd162-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.991282 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jb5j" event={"ID":"637d562d-110f-46c4-90c7-425e929bd162","Type":"ContainerDied","Data":"c19e82613793caead7fd54bfb7863ae2bdb7da6b3063189581a41be40c43c214"} Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.991341 4954 scope.go:117] "RemoveContainer" containerID="885e8c8f4bd85874bf8917a28cdf4ab8ffa7e48fe025031e76395bdd48e6d14c" Dec 06 10:05:27 crc kubenswrapper[4954]: I1206 10:05:27.991362 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jb5j" Dec 06 10:05:28 crc kubenswrapper[4954]: I1206 10:05:28.019037 4954 scope.go:117] "RemoveContainer" containerID="fac1f9c81eb2660b98f3527a012dcb55ce7febae208aabcf0760ffac0c10cb17" Dec 06 10:05:28 crc kubenswrapper[4954]: I1206 10:05:28.024467 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:05:28 crc kubenswrapper[4954]: I1206 10:05:28.048371 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8jb5j"] Dec 06 10:05:28 crc kubenswrapper[4954]: I1206 10:05:28.060946 4954 scope.go:117] "RemoveContainer" containerID="14ff26e962bcdd5694542271e8d5b051d45447f6fec0238b6a69faaffa143e8f" Dec 06 10:05:29 crc kubenswrapper[4954]: I1206 10:05:29.459052 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637d562d-110f-46c4-90c7-425e929bd162" path="/var/lib/kubelet/pods/637d562d-110f-46c4-90c7-425e929bd162/volumes" Dec 06 10:05:30 crc kubenswrapper[4954]: I1206 10:05:30.017126 4954 generic.go:334] "Generic (PLEG): container finished" podID="b081c7f9-a3db-4ca2-8837-aa51d639c048" containerID="130ea2278f6920a0111a57c3d384d4e86caf46e68392b9dfde42690c1a03c091" exitCode=0 Dec 06 10:05:30 crc kubenswrapper[4954]: I1206 10:05:30.017173 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bhhv9" event={"ID":"b081c7f9-a3db-4ca2-8837-aa51d639c048","Type":"ContainerDied","Data":"130ea2278f6920a0111a57c3d384d4e86caf46e68392b9dfde42690c1a03c091"} Dec 06 10:05:31 crc kubenswrapper[4954]: I1206 10:05:31.769370 4954 scope.go:117] "RemoveContainer" containerID="6a62c39c882933b46cfe1a28ff01fcc7b26b1365d2bd856458f159ad0aeeb326" Dec 06 10:05:31 crc kubenswrapper[4954]: I1206 10:05:31.915130 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:31 crc kubenswrapper[4954]: I1206 10:05:31.967009 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-bhhv9"] Dec 06 10:05:31 crc kubenswrapper[4954]: I1206 10:05:31.977454 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-bhhv9"] Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.036924 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53d739daa1e60d1597407739683978b62a9fcc5948d3523c6ea84f9d9677663" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.036972 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bhhv9" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052555 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052673 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwh4w\" (UniqueName: \"kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052728 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052763 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052837 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052867 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.052967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices\") pod \"b081c7f9-a3db-4ca2-8837-aa51d639c048\" (UID: \"b081c7f9-a3db-4ca2-8837-aa51d639c048\") " Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.053768 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod 
"b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.054172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.067445 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w" (OuterVolumeSpecName: "kube-api-access-vwh4w") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "kube-api-access-vwh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.083411 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.088604 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts" (OuterVolumeSpecName: "scripts") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.090841 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.097503 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b081c7f9-a3db-4ca2-8837-aa51d639c048" (UID: "b081c7f9-a3db-4ca2-8837-aa51d639c048"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155645 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155681 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b081c7f9-a3db-4ca2-8837-aa51d639c048-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155694 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155706 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155717 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwh4w\" (UniqueName: \"kubernetes.io/projected/b081c7f9-a3db-4ca2-8837-aa51d639c048-kube-api-access-vwh4w\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155730 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b081c7f9-a3db-4ca2-8837-aa51d639c048-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:32 crc kubenswrapper[4954]: I1206 10:05:32.155738 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b081c7f9-a3db-4ca2-8837-aa51d639c048-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:33 crc kubenswrapper[4954]: I1206 10:05:33.455737 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b081c7f9-a3db-4ca2-8837-aa51d639c048" path="/var/lib/kubelet/pods/b081c7f9-a3db-4ca2-8837-aa51d639c048/volumes" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.410389 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-frjxz"] Dec 06 10:05:35 crc kubenswrapper[4954]: E1206 10:05:35.411343 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b081c7f9-a3db-4ca2-8837-aa51d639c048" containerName="swift-ring-rebalance" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411358 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b081c7f9-a3db-4ca2-8837-aa51d639c048" containerName="swift-ring-rebalance" Dec 06 10:05:35 crc kubenswrapper[4954]: E1206 10:05:35.411377 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="extract-utilities" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411384 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="extract-utilities" Dec 06 10:05:35 crc kubenswrapper[4954]: E1206 10:05:35.411411 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="extract-content" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411416 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="extract-content" Dec 06 10:05:35 crc 
kubenswrapper[4954]: E1206 10:05:35.411441 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="registry-server" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411448 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="registry-server" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411679 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="637d562d-110f-46c4-90c7-425e929bd162" containerName="registry-server" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.411714 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b081c7f9-a3db-4ca2-8837-aa51d639c048" containerName="swift-ring-rebalance" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.412777 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.414992 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.418243 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.421954 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-frjxz"] Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481570 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481648 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481751 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481767 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ccd\" (UniqueName: \"kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481806 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 
10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481838 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.481890 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.583778 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.583850 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.583976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22ccd\" (UniqueName: \"kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.583998 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.584029 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.584065 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.584610 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " 
pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.584697 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.585865 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.586273 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.590498 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.591059 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.593757 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.602295 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ccd\" (UniqueName: \"kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd\") pod \"swift-ring-rebalance-debug-frjxz\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:35 crc kubenswrapper[4954]: I1206 10:05:35.745089 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:36 crc kubenswrapper[4954]: I1206 10:05:36.230738 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-frjxz"] Dec 06 10:05:37 crc kubenswrapper[4954]: I1206 10:05:37.087388 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frjxz" event={"ID":"93459e7d-0be5-472b-a304-5d08f48ab6a5","Type":"ContainerStarted","Data":"75ee6942ae87684e64cf9fc4273c140a3f27f61a2ea846c15b420a4daf9fd381"} Dec 06 10:05:37 crc kubenswrapper[4954]: I1206 10:05:37.087727 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frjxz" event={"ID":"93459e7d-0be5-472b-a304-5d08f48ab6a5","Type":"ContainerStarted","Data":"afa782d656eb2cde491c4b302b3d6507884a2eab3f1a99fcc2c26a02da57282d"} Dec 06 10:05:37 crc kubenswrapper[4954]: I1206 10:05:37.117076 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-frjxz" podStartSLOduration=2.117056339 podStartE2EDuration="2.117056339s" podCreationTimestamp="2025-12-06 10:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:05:37.103040646 +0000 UTC m=+11311.916400035" watchObservedRunningTime="2025-12-06 10:05:37.117056339 +0000 UTC m=+11311.930415728" Dec 06 10:05:40 crc kubenswrapper[4954]: I1206 10:05:40.443784 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:05:40 crc kubenswrapper[4954]: E1206 10:05:40.444490 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.460952 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.463601 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.465927 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.584942 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5b8\" (UniqueName: \"kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.588368 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.588428 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.691001 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5b8\" (UniqueName: \"kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.691241 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.691271 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.691744 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.691775 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.720638 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ml5b8\" (UniqueName: \"kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8\") pod \"certified-operators-4jn84\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:45 crc kubenswrapper[4954]: I1206 10:05:45.800749 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:46 crc kubenswrapper[4954]: I1206 10:05:46.664923 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:05:46 crc kubenswrapper[4954]: W1206 10:05:46.691861 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a907150_6f43_4ce1_9605_7dd27217f384.slice/crio-a2f9b49ab8aa8a65c0e58c960645df21f49af6a2ccf4700946db397c20991a0b WatchSource:0}: Error finding container a2f9b49ab8aa8a65c0e58c960645df21f49af6a2ccf4700946db397c20991a0b: Status 404 returned error can't find the container with id a2f9b49ab8aa8a65c0e58c960645df21f49af6a2ccf4700946db397c20991a0b Dec 06 10:05:47 crc kubenswrapper[4954]: I1206 10:05:47.200580 4954 generic.go:334] "Generic (PLEG): container finished" podID="93459e7d-0be5-472b-a304-5d08f48ab6a5" containerID="75ee6942ae87684e64cf9fc4273c140a3f27f61a2ea846c15b420a4daf9fd381" exitCode=0 Dec 06 10:05:47 crc kubenswrapper[4954]: I1206 10:05:47.200710 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-frjxz" event={"ID":"93459e7d-0be5-472b-a304-5d08f48ab6a5","Type":"ContainerDied","Data":"75ee6942ae87684e64cf9fc4273c140a3f27f61a2ea846c15b420a4daf9fd381"} Dec 06 10:05:47 crc kubenswrapper[4954]: I1206 10:05:47.203210 4954 generic.go:334] "Generic (PLEG): container finished" podID="8a907150-6f43-4ce1-9605-7dd27217f384" containerID="ede712135015bfc9641b6ea2f687ef91e793b24f23ac61c9bb5c8d6584543b36" exitCode=0 Dec 06 10:05:47 crc kubenswrapper[4954]: I1206 10:05:47.203242 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerDied","Data":"ede712135015bfc9641b6ea2f687ef91e793b24f23ac61c9bb5c8d6584543b36"} Dec 06 10:05:47 crc kubenswrapper[4954]: I1206 10:05:47.203263 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerStarted","Data":"a2f9b49ab8aa8a65c0e58c960645df21f49af6a2ccf4700946db397c20991a0b"} Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.123382 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.211195 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-frjxz"] Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.222964 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-frjxz"] Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.240348 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerStarted","Data":"a3c354c813be5f1722b1a655cc8699a8cd434119cc5b3bea3551d94fd6d1ce5a"} Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.248622 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa782d656eb2cde491c4b302b3d6507884a2eab3f1a99fcc2c26a02da57282d" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.248704 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-frjxz" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.262464 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.262553 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.262663 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.262726 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.262790 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.263182 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.263444 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ccd\" (UniqueName: \"kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.263477 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts\") pod \"93459e7d-0be5-472b-a304-5d08f48ab6a5\" (UID: \"93459e7d-0be5-472b-a304-5d08f48ab6a5\") " Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.263932 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.268397 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.274228 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd" (OuterVolumeSpecName: "kube-api-access-22ccd") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "kube-api-access-22ccd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.299451 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.300911 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts" (OuterVolumeSpecName: "scripts") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.304073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.312321 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93459e7d-0be5-472b-a304-5d08f48ab6a5" (UID: "93459e7d-0be5-472b-a304-5d08f48ab6a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.365916 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.365964 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.365974 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22ccd\" (UniqueName: \"kubernetes.io/projected/93459e7d-0be5-472b-a304-5d08f48ab6a5-kube-api-access-22ccd\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.365985 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93459e7d-0be5-472b-a304-5d08f48ab6a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.365994 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/93459e7d-0be5-472b-a304-5d08f48ab6a5-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.366004 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93459e7d-0be5-472b-a304-5d08f48ab6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:49 crc kubenswrapper[4954]: I1206 10:05:49.455424 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93459e7d-0be5-472b-a304-5d08f48ab6a5" path="/var/lib/kubelet/pods/93459e7d-0be5-472b-a304-5d08f48ab6a5/volumes" Dec 06 10:05:50 crc kubenswrapper[4954]: I1206 10:05:50.269001 4954 generic.go:334] "Generic (PLEG): container finished" podID="8a907150-6f43-4ce1-9605-7dd27217f384" containerID="a3c354c813be5f1722b1a655cc8699a8cd434119cc5b3bea3551d94fd6d1ce5a" exitCode=0 Dec 06 10:05:50 crc kubenswrapper[4954]: I1206 10:05:50.269331 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerDied","Data":"a3c354c813be5f1722b1a655cc8699a8cd434119cc5b3bea3551d94fd6d1ce5a"} Dec 06 10:05:51 crc kubenswrapper[4954]: I1206 10:05:51.280715 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerStarted","Data":"aad4bcb92c5b699da8fa86f2945b33ca28cf75aeeef56a2788d80c5d31393aa1"} Dec 06 10:05:51 crc kubenswrapper[4954]: I1206 10:05:51.301707 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4jn84" podStartSLOduration=2.859307511 podStartE2EDuration="6.301688262s" podCreationTimestamp="2025-12-06 10:05:45 +0000 UTC" 
firstStartedPulling="2025-12-06 10:05:47.20686884 +0000 UTC m=+11322.020228229" lastFinishedPulling="2025-12-06 10:05:50.649249591 +0000 UTC m=+11325.462608980" observedRunningTime="2025-12-06 10:05:51.299635657 +0000 UTC m=+11326.112995036" watchObservedRunningTime="2025-12-06 10:05:51.301688262 +0000 UTC m=+11326.115047651" Dec 06 10:05:54 crc kubenswrapper[4954]: I1206 10:05:54.443365 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:05:54 crc kubenswrapper[4954]: E1206 10:05:54.444336 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:05:55 crc kubenswrapper[4954]: I1206 10:05:55.802778 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:55 crc kubenswrapper[4954]: I1206 10:05:55.803152 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:55 crc kubenswrapper[4954]: I1206 10:05:55.859855 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:56 crc kubenswrapper[4954]: I1206 10:05:56.395521 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:56 crc kubenswrapper[4954]: I1206 10:05:56.447328 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:05:58 crc kubenswrapper[4954]: I1206 10:05:58.372295 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4jn84" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="registry-server" containerID="cri-o://aad4bcb92c5b699da8fa86f2945b33ca28cf75aeeef56a2788d80c5d31393aa1" gracePeriod=2 Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.385165 4954 generic.go:334] "Generic (PLEG): container finished" podID="8a907150-6f43-4ce1-9605-7dd27217f384" containerID="aad4bcb92c5b699da8fa86f2945b33ca28cf75aeeef56a2788d80c5d31393aa1" exitCode=0 Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.385248 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerDied","Data":"aad4bcb92c5b699da8fa86f2945b33ca28cf75aeeef56a2788d80c5d31393aa1"} Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.527941 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.682660 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml5b8\" (UniqueName: \"kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8\") pod \"8a907150-6f43-4ce1-9605-7dd27217f384\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.682778 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities\") pod \"8a907150-6f43-4ce1-9605-7dd27217f384\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.682883 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content\") pod \"8a907150-6f43-4ce1-9605-7dd27217f384\" (UID: \"8a907150-6f43-4ce1-9605-7dd27217f384\") " Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.683888 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities" (OuterVolumeSpecName: "utilities") pod "8a907150-6f43-4ce1-9605-7dd27217f384" (UID: "8a907150-6f43-4ce1-9605-7dd27217f384"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.688163 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8" (OuterVolumeSpecName: "kube-api-access-ml5b8") pod "8a907150-6f43-4ce1-9605-7dd27217f384" (UID: "8a907150-6f43-4ce1-9605-7dd27217f384"). InnerVolumeSpecName "kube-api-access-ml5b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.785194 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml5b8\" (UniqueName: \"kubernetes.io/projected/8a907150-6f43-4ce1-9605-7dd27217f384-kube-api-access-ml5b8\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.785227 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.933384 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a907150-6f43-4ce1-9605-7dd27217f384" (UID: "8a907150-6f43-4ce1-9605-7dd27217f384"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:05:59 crc kubenswrapper[4954]: I1206 10:05:59.989077 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a907150-6f43-4ce1-9605-7dd27217f384-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.397842 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jn84" event={"ID":"8a907150-6f43-4ce1-9605-7dd27217f384","Type":"ContainerDied","Data":"a2f9b49ab8aa8a65c0e58c960645df21f49af6a2ccf4700946db397c20991a0b"} Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.397903 4954 scope.go:117] "RemoveContainer" containerID="aad4bcb92c5b699da8fa86f2945b33ca28cf75aeeef56a2788d80c5d31393aa1" Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.398087 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jn84" Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.422283 4954 scope.go:117] "RemoveContainer" containerID="a3c354c813be5f1722b1a655cc8699a8cd434119cc5b3bea3551d94fd6d1ce5a" Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.452228 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.465207 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4jn84"] Dec 06 10:06:00 crc kubenswrapper[4954]: I1206 10:06:00.481219 4954 scope.go:117] "RemoveContainer" containerID="ede712135015bfc9641b6ea2f687ef91e793b24f23ac61c9bb5c8d6584543b36" Dec 06 10:06:01 crc kubenswrapper[4954]: I1206 10:06:01.455177 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" path="/var/lib/kubelet/pods/8a907150-6f43-4ce1-9605-7dd27217f384/volumes" Dec 06 10:06:05 crc kubenswrapper[4954]: I1206 10:06:05.453080 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:06:05 crc kubenswrapper[4954]: E1206 10:06:05.454205 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:06:16 crc kubenswrapper[4954]: I1206 10:06:16.443996 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:06:16 crc kubenswrapper[4954]: E1206 10:06:16.444774 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:06:31 crc kubenswrapper[4954]: I1206 10:06:31.444517 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:06:31 crc kubenswrapper[4954]: E1206 
10:06:31.445550 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:06:32 crc kubenswrapper[4954]: I1206 10:06:32.010736 4954 scope.go:117] "RemoveContainer" containerID="85d3d67cf2e7c4a8ca306645a3026d3abc7a6a9b2302357b9580eeb9da9a729f" Dec 06 10:06:42 crc kubenswrapper[4954]: I1206 10:06:42.444361 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:06:42 crc kubenswrapper[4954]: E1206 10:06:42.445147 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.382652 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-w5qpq"] Dec 06 10:06:49 crc kubenswrapper[4954]: E1206 10:06:49.385053 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="registry-server" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.385204 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="registry-server" Dec 06 10:06:49 crc kubenswrapper[4954]: E1206 10:06:49.385306 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93459e7d-0be5-472b-a304-5d08f48ab6a5" containerName="swift-ring-rebalance" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.385419 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="93459e7d-0be5-472b-a304-5d08f48ab6a5" containerName="swift-ring-rebalance" Dec 06 10:06:49 crc kubenswrapper[4954]: E1206 10:06:49.385531 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="extract-content" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.385649 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="extract-content" Dec 06 10:06:49 crc kubenswrapper[4954]: E1206 10:06:49.385748 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="extract-utilities" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.385836 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="extract-utilities" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.386271 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="93459e7d-0be5-472b-a304-5d08f48ab6a5" containerName="swift-ring-rebalance" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.386389 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a907150-6f43-4ce1-9605-7dd27217f384" containerName="registry-server" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.387527 4954 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.390079 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.390519 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.396744 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-w5qpq"] Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.534922 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535209 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535275 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7n7\" (UniqueName: \"kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535369 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535411 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535481 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.535678 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: 
I1206 10:06:49.638023 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638110 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638140 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638193 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7n7\" (UniqueName: \"kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638266 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638293 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638325 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.638794 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.640245 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.641320 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.649919 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.655296 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.656883 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.661030 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7n7\" (UniqueName: \"kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7\") pod \"swift-ring-rebalance-debug-w5qpq\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:49 crc kubenswrapper[4954]: I1206 10:06:49.707226 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:06:50 crc kubenswrapper[4954]: I1206 10:06:50.477172 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-w5qpq"] Dec 06 10:06:50 crc kubenswrapper[4954]: I1206 10:06:50.990621 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-w5qpq" event={"ID":"3475b0a3-3159-4734-a75c-47a7c70aca60","Type":"ContainerStarted","Data":"2cb2e0aa370ab9f0741ae11d86b39e5b76a3921aeea935079b746baac62f57de"} Dec 06 10:06:50 crc kubenswrapper[4954]: I1206 10:06:50.990967 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-w5qpq" event={"ID":"3475b0a3-3159-4734-a75c-47a7c70aca60","Type":"ContainerStarted","Data":"0beaaecd77ee1cb76b2863aded06e22851598ed4c1d6854c1a823f40d3f6bc26"} Dec 06 10:06:51 crc kubenswrapper[4954]: I1206 10:06:51.013450 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-w5qpq" podStartSLOduration=2.013419247 podStartE2EDuration="2.013419247s" podCreationTimestamp="2025-12-06 10:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:06:51.007260082 +0000 UTC m=+11385.820619471" watchObservedRunningTime="2025-12-06 10:06:51.013419247 +0000 UTC m=+11385.826778636" Dec 06 10:06:57 crc kubenswrapper[4954]: I1206 10:06:57.443736 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:06:57 crc kubenswrapper[4954]: E1206 10:06:57.444634 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:07:02 crc kubenswrapper[4954]: I1206 10:07:02.098305 4954 generic.go:334] "Generic (PLEG): container finished" podID="3475b0a3-3159-4734-a75c-47a7c70aca60" containerID="2cb2e0aa370ab9f0741ae11d86b39e5b76a3921aeea935079b746baac62f57de" exitCode=0 Dec 06 10:07:02 crc kubenswrapper[4954]: I1206 10:07:02.098387 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-w5qpq" event={"ID":"3475b0a3-3159-4734-a75c-47a7c70aca60","Type":"ContainerDied","Data":"2cb2e0aa370ab9f0741ae11d86b39e5b76a3921aeea935079b746baac62f57de"} Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.702797 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.740077 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-w5qpq"] Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.754354 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-w5qpq"] Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.875743 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7n7\" (UniqueName: \"kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.875798 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.875899 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.876010 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.876073 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.876103 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.876123 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf\") pod \"3475b0a3-3159-4734-a75c-47a7c70aca60\" (UID: \"3475b0a3-3159-4734-a75c-47a7c70aca60\") " Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.877494 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.878108 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.884789 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7" (OuterVolumeSpecName: "kube-api-access-2s7n7") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "kube-api-access-2s7n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.914688 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.921918 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts" (OuterVolumeSpecName: "scripts") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.936902 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.973758 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3475b0a3-3159-4734-a75c-47a7c70aca60" (UID: "3475b0a3-3159-4734-a75c-47a7c70aca60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.978583 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979289 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979381 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979468 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3475b0a3-3159-4734-a75c-47a7c70aca60-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979545 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3475b0a3-3159-4734-a75c-47a7c70aca60-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979684 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s7n7\" (UniqueName: \"kubernetes.io/projected/3475b0a3-3159-4734-a75c-47a7c70aca60-kube-api-access-2s7n7\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:04 crc kubenswrapper[4954]: I1206 10:07:04.979783 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3475b0a3-3159-4734-a75c-47a7c70aca60-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:05 crc kubenswrapper[4954]: I1206 10:07:05.132755 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0beaaecd77ee1cb76b2863aded06e22851598ed4c1d6854c1a823f40d3f6bc26" Dec 06 10:07:05 crc kubenswrapper[4954]: I1206 10:07:05.133020 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-w5qpq" Dec 06 10:07:05 crc kubenswrapper[4954]: I1206 10:07:05.463294 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3475b0a3-3159-4734-a75c-47a7c70aca60" path="/var/lib/kubelet/pods/3475b0a3-3159-4734-a75c-47a7c70aca60/volumes" Dec 06 10:07:09 crc kubenswrapper[4954]: I1206 10:07:09.444798 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:07:09 crc kubenswrapper[4954]: E1206 10:07:09.445980 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:07:24 crc kubenswrapper[4954]: I1206 10:07:24.443473 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:07:24 crc kubenswrapper[4954]: E1206 10:07:24.444291 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:07:32 crc kubenswrapper[4954]: I1206 10:07:32.121064 4954 scope.go:117] "RemoveContainer" containerID="4a0fa9af378e124c36b0ef5dcedf96c573d59c556299926d25525b4eac28a312" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.797274 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:35 crc kubenswrapper[4954]: E1206 10:07:35.798287 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3475b0a3-3159-4734-a75c-47a7c70aca60" containerName="swift-ring-rebalance" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.798299 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3475b0a3-3159-4734-a75c-47a7c70aca60" containerName="swift-ring-rebalance" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.798588 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3475b0a3-3159-4734-a75c-47a7c70aca60" containerName="swift-ring-rebalance" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.800651 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.834727 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.963992 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.964074 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:35 crc kubenswrapper[4954]: I1206 10:07:35.964399 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmd6w\" (UniqueName: \"kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.065986 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmd6w\" (UniqueName: \"kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.066058 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.066137 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.066645 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.066765 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.088954 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fmd6w\" (UniqueName: \"kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w\") pod \"redhat-marketplace-xvdkc\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.121166 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.443830 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:07:36 crc kubenswrapper[4954]: E1206 10:07:36.444635 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:07:36 crc kubenswrapper[4954]: I1206 10:07:36.881313 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:37 crc kubenswrapper[4954]: I1206 10:07:37.499772 4954 generic.go:334] "Generic (PLEG): container finished" podID="df5d2801-7809-4385-9a28-f742bdad7618" containerID="18cd866eb56f8dd7ec26898e1c3229a05f6863c3b754f8fc2d2035c57db17f3a" exitCode=0 Dec 06 10:07:37 crc kubenswrapper[4954]: I1206 10:07:37.499864 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerDied","Data":"18cd866eb56f8dd7ec26898e1c3229a05f6863c3b754f8fc2d2035c57db17f3a"} Dec 06 10:07:37 crc kubenswrapper[4954]: I1206 10:07:37.500157 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerStarted","Data":"2e9aecd3ab52c191b3822672ff26bd5309e8994eea558e5f9b8edbc3fe09789d"} Dec 06 10:07:37 crc kubenswrapper[4954]: I1206 10:07:37.501774 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:07:38 crc kubenswrapper[4954]: I1206 10:07:38.512444 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerStarted","Data":"9f1d3f7cfdc5f53c132968008e62641ef94a08ae0becbf4a37157a5c4a95058f"} Dec 06 10:07:39 crc kubenswrapper[4954]: I1206 10:07:39.534068 4954 generic.go:334] "Generic (PLEG): container finished" podID="df5d2801-7809-4385-9a28-f742bdad7618" containerID="9f1d3f7cfdc5f53c132968008e62641ef94a08ae0becbf4a37157a5c4a95058f" exitCode=0 Dec 06 10:07:39 crc kubenswrapper[4954]: I1206 10:07:39.534112 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerDied","Data":"9f1d3f7cfdc5f53c132968008e62641ef94a08ae0becbf4a37157a5c4a95058f"} Dec 06 10:07:40 crc kubenswrapper[4954]: I1206 10:07:40.547505 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" 
event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerStarted","Data":"85b59352e77cd8029fa8f4b75a8cd5b37ab05a246c09afdcd9513fe4ea11b32e"} Dec 06 10:07:40 crc kubenswrapper[4954]: I1206 10:07:40.566699 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvdkc" podStartSLOduration=3.151528906 podStartE2EDuration="5.566676449s" podCreationTimestamp="2025-12-06 10:07:35 +0000 UTC" firstStartedPulling="2025-12-06 10:07:37.501539469 +0000 UTC m=+11432.314898858" lastFinishedPulling="2025-12-06 10:07:39.916687012 +0000 UTC m=+11434.730046401" observedRunningTime="2025-12-06 10:07:40.563000701 +0000 UTC m=+11435.376360110" watchObservedRunningTime="2025-12-06 10:07:40.566676449 +0000 UTC m=+11435.380035838" Dec 06 10:07:46 crc kubenswrapper[4954]: I1206 10:07:46.121939 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:46 crc kubenswrapper[4954]: I1206 10:07:46.122594 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:46 crc kubenswrapper[4954]: I1206 10:07:46.180450 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:46 crc kubenswrapper[4954]: I1206 10:07:46.661042 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:46 crc kubenswrapper[4954]: I1206 10:07:46.706956 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:48 crc kubenswrapper[4954]: I1206 10:07:48.628726 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xvdkc" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="registry-server" containerID="cri-o://85b59352e77cd8029fa8f4b75a8cd5b37ab05a246c09afdcd9513fe4ea11b32e" gracePeriod=2 Dec 06 10:07:49 crc kubenswrapper[4954]: I1206 10:07:49.643501 4954 generic.go:334] "Generic (PLEG): container finished" podID="df5d2801-7809-4385-9a28-f742bdad7618" containerID="85b59352e77cd8029fa8f4b75a8cd5b37ab05a246c09afdcd9513fe4ea11b32e" exitCode=0 Dec 06 10:07:49 crc kubenswrapper[4954]: I1206 10:07:49.643540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerDied","Data":"85b59352e77cd8029fa8f4b75a8cd5b37ab05a246c09afdcd9513fe4ea11b32e"} Dec 06 10:07:49 crc kubenswrapper[4954]: I1206 10:07:49.944809 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.051072 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmd6w\" (UniqueName: \"kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w\") pod \"df5d2801-7809-4385-9a28-f742bdad7618\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.051136 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities\") pod \"df5d2801-7809-4385-9a28-f742bdad7618\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.051228 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content\") pod \"df5d2801-7809-4385-9a28-f742bdad7618\" (UID: \"df5d2801-7809-4385-9a28-f742bdad7618\") " Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.052536 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities" (OuterVolumeSpecName: "utilities") pod "df5d2801-7809-4385-9a28-f742bdad7618" (UID: "df5d2801-7809-4385-9a28-f742bdad7618"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.067463 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w" (OuterVolumeSpecName: "kube-api-access-fmd6w") pod "df5d2801-7809-4385-9a28-f742bdad7618" (UID: "df5d2801-7809-4385-9a28-f742bdad7618"). InnerVolumeSpecName "kube-api-access-fmd6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.074073 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5d2801-7809-4385-9a28-f742bdad7618" (UID: "df5d2801-7809-4385-9a28-f742bdad7618"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.154277 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmd6w\" (UniqueName: \"kubernetes.io/projected/df5d2801-7809-4385-9a28-f742bdad7618-kube-api-access-fmd6w\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.154315 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.154324 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5d2801-7809-4385-9a28-f742bdad7618-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.669090 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvdkc" event={"ID":"df5d2801-7809-4385-9a28-f742bdad7618","Type":"ContainerDied","Data":"2e9aecd3ab52c191b3822672ff26bd5309e8994eea558e5f9b8edbc3fe09789d"} Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.669146 4954 scope.go:117] "RemoveContainer" containerID="85b59352e77cd8029fa8f4b75a8cd5b37ab05a246c09afdcd9513fe4ea11b32e" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.669158 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvdkc" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.702360 4954 scope.go:117] "RemoveContainer" containerID="9f1d3f7cfdc5f53c132968008e62641ef94a08ae0becbf4a37157a5c4a95058f" Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.714961 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.724706 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvdkc"] Dec 06 10:07:50 crc kubenswrapper[4954]: I1206 10:07:50.737992 4954 scope.go:117] "RemoveContainer" containerID="18cd866eb56f8dd7ec26898e1c3229a05f6863c3b754f8fc2d2035c57db17f3a" Dec 06 10:07:51 crc kubenswrapper[4954]: I1206 10:07:51.444991 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:07:51 crc kubenswrapper[4954]: E1206 10:07:51.445408 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:07:51 crc kubenswrapper[4954]: I1206 10:07:51.456881 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5d2801-7809-4385-9a28-f742bdad7618" path="/var/lib/kubelet/pods/df5d2801-7809-4385-9a28-f742bdad7618/volumes" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.921138 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-s5h2c"] Dec 06 10:08:04 crc kubenswrapper[4954]: E1206 10:08:04.922257 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5d2801-7809-4385-9a28-f742bdad7618" 
containerName="extract-content" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.922273 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="extract-content" Dec 06 10:08:04 crc kubenswrapper[4954]: E1206 10:08:04.922303 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="extract-utilities" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.922312 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="extract-utilities" Dec 06 10:08:04 crc kubenswrapper[4954]: E1206 10:08:04.922374 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="registry-server" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.922383 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="registry-server" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.922672 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5d2801-7809-4385-9a28-f742bdad7618" containerName="registry-server" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.923613 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.929470 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.930698 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:08:04 crc kubenswrapper[4954]: I1206 10:08:04.937320 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-s5h2c"] Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.058531 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.058655 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.058728 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.058896 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " 
pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.058916 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.059185 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8h27\" (UniqueName: \"kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.059218 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.160794 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161065 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161229 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161357 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161485 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161587 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: 
\"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161714 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8h27\" (UniqueName: \"kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.161811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.163282 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.163397 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.169137 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.169149 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.170221 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.189145 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8h27\" (UniqueName: \"kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27\") pod \"swift-ring-rebalance-debug-s5h2c\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:05 crc kubenswrapper[4954]: I1206 10:08:05.274657 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:06 crc kubenswrapper[4954]: I1206 10:08:06.246322 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-s5h2c"] Dec 06 10:08:06 crc kubenswrapper[4954]: I1206 10:08:06.443137 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:08:06 crc kubenswrapper[4954]: E1206 10:08:06.443664 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:08:06 crc kubenswrapper[4954]: I1206 10:08:06.928834 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-s5h2c" event={"ID":"08406709-abae-4631-85b9-69fd218c0634","Type":"ContainerStarted","Data":"0dacb5f582e1b9282f4e2dd4ad76fc22bd2f53a365726b3deff2c831a9064945"} Dec 06 10:08:06 crc kubenswrapper[4954]: I1206 10:08:06.929175 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-s5h2c" event={"ID":"08406709-abae-4631-85b9-69fd218c0634","Type":"ContainerStarted","Data":"73f719484ddefc6992a1a2ffe1b78594976872208beeac279d7606838c42a3f7"} Dec 06 10:08:19 crc kubenswrapper[4954]: I1206 10:08:19.443934 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:08:20 crc kubenswrapper[4954]: I1206 10:08:20.065280 4954 generic.go:334] "Generic (PLEG): container finished" podID="08406709-abae-4631-85b9-69fd218c0634" containerID="0dacb5f582e1b9282f4e2dd4ad76fc22bd2f53a365726b3deff2c831a9064945" exitCode=0 Dec 06 10:08:20 crc kubenswrapper[4954]: I1206 10:08:20.065336 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-s5h2c" event={"ID":"08406709-abae-4631-85b9-69fd218c0634","Type":"ContainerDied","Data":"0dacb5f582e1b9282f4e2dd4ad76fc22bd2f53a365726b3deff2c831a9064945"} Dec 06 10:08:20 crc kubenswrapper[4954]: I1206 10:08:20.075136 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f"} Dec 06 10:08:22 crc kubenswrapper[4954]: I1206 10:08:22.991320 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.040227 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-s5h2c"] Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.056823 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-s5h2c"] Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.108286 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73f719484ddefc6992a1a2ffe1b78594976872208beeac279d7606838c42a3f7" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.108354 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-s5h2c" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.174690 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.174780 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.174803 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.174931 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.174985 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.175074 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8h27\" (UniqueName: \"kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.175163 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle\") pod \"08406709-abae-4631-85b9-69fd218c0634\" (UID: \"08406709-abae-4631-85b9-69fd218c0634\") " Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.175730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.176940 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.199666 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27" (OuterVolumeSpecName: "kube-api-access-c8h27") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "kube-api-access-c8h27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.218002 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.241787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.261242 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts" (OuterVolumeSpecName: "scripts") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.265730 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "08406709-abae-4631-85b9-69fd218c0634" (UID: "08406709-abae-4631-85b9-69fd218c0634"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277137 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277167 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277278 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08406709-abae-4631-85b9-69fd218c0634-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277291 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8h27\" (UniqueName: \"kubernetes.io/projected/08406709-abae-4631-85b9-69fd218c0634-kube-api-access-c8h27\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277300 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277308 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08406709-abae-4631-85b9-69fd218c0634-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.277338 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08406709-abae-4631-85b9-69fd218c0634-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.516800 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08406709-abae-4631-85b9-69fd218c0634" path="/var/lib/kubelet/pods/08406709-abae-4631-85b9-69fd218c0634/volumes" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.517451 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-phnv2"] Dec 06 10:08:23 crc kubenswrapper[4954]: E1206 10:08:23.519205 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08406709-abae-4631-85b9-69fd218c0634" containerName="swift-ring-rebalance" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.519231 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="08406709-abae-4631-85b9-69fd218c0634" containerName="swift-ring-rebalance" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.519496 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="08406709-abae-4631-85b9-69fd218c0634" containerName="swift-ring-rebalance" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.520392 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-phnv2"] Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.520500 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.522311 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.523235 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.615018 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.615262 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.615459 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.615829 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.616104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c969s\" (UniqueName: \"kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.616233 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.616340 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718028 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718386 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718409 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718472 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c969s\" (UniqueName: \"kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718546 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.718583 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.719353 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.719496 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.719611 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.724538 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.724575 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.724749 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.735493 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c969s\" (UniqueName: \"kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s\") pod \"swift-ring-rebalance-debug-phnv2\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:23 crc kubenswrapper[4954]: I1206 10:08:23.860389 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:24 crc kubenswrapper[4954]: I1206 10:08:24.643352 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-phnv2"] Dec 06 10:08:25 crc kubenswrapper[4954]: I1206 10:08:25.131228 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-phnv2" event={"ID":"91e95b00-60bb-4f8a-933e-47bd1f36c336","Type":"ContainerStarted","Data":"ed920bb85f510b053b00bae050c4ea96fc2448c010e61a1ab30c164a9abb654b"} Dec 06 10:08:25 crc kubenswrapper[4954]: I1206 10:08:25.131526 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-phnv2" event={"ID":"91e95b00-60bb-4f8a-933e-47bd1f36c336","Type":"ContainerStarted","Data":"d7c77fd704d604b75f3d36297760920ccdddc0c2714641217bd8dbf199411149"} Dec 06 10:08:31 crc kubenswrapper[4954]: I1206 10:08:31.191930 4954 generic.go:334] "Generic (PLEG): container finished" podID="91e95b00-60bb-4f8a-933e-47bd1f36c336" containerID="ed920bb85f510b053b00bae050c4ea96fc2448c010e61a1ab30c164a9abb654b" exitCode=0 Dec 06 10:08:31 crc kubenswrapper[4954]: I1206 10:08:31.192438 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-phnv2" event={"ID":"91e95b00-60bb-4f8a-933e-47bd1f36c336","Type":"ContainerDied","Data":"ed920bb85f510b053b00bae050c4ea96fc2448c010e61a1ab30c164a9abb654b"} Dec 06 10:08:33 crc kubenswrapper[4954]: I1206 10:08:33.867043 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:33 crc kubenswrapper[4954]: I1206 10:08:33.913383 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-phnv2"] Dec 06 10:08:33 crc kubenswrapper[4954]: I1206 10:08:33.918165 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-phnv2"] Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.023390 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.023747 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.023809 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.023892 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.023969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.024007 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.024074 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c969s\" (UniqueName: \"kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s\") pod \"91e95b00-60bb-4f8a-933e-47bd1f36c336\" (UID: \"91e95b00-60bb-4f8a-933e-47bd1f36c336\") " Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.025276 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.026289 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.046826 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s" (OuterVolumeSpecName: "kube-api-access-c969s") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "kube-api-access-c969s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.073156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts" (OuterVolumeSpecName: "scripts") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.078800 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.083957 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.102649 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91e95b00-60bb-4f8a-933e-47bd1f36c336" (UID: "91e95b00-60bb-4f8a-933e-47bd1f36c336"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126393 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126437 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126446 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91e95b00-60bb-4f8a-933e-47bd1f36c336-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126454 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126463 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91e95b00-60bb-4f8a-933e-47bd1f36c336-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126498 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e95b00-60bb-4f8a-933e-47bd1f36c336-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.126508 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c969s\" (UniqueName: \"kubernetes.io/projected/91e95b00-60bb-4f8a-933e-47bd1f36c336-kube-api-access-c969s\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.245772 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7c77fd704d604b75f3d36297760920ccdddc0c2714641217bd8dbf199411149" Dec 06 10:08:34 crc kubenswrapper[4954]: I1206 10:08:34.245788 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-phnv2" Dec 06 10:08:35 crc kubenswrapper[4954]: I1206 10:08:35.455956 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e95b00-60bb-4f8a-933e-47bd1f36c336" path="/var/lib/kubelet/pods/91e95b00-60bb-4f8a-933e-47bd1f36c336/volumes" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.526154 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-bsns4"] Dec 06 10:08:37 crc kubenswrapper[4954]: E1206 10:08:37.528654 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e95b00-60bb-4f8a-933e-47bd1f36c336" containerName="swift-ring-rebalance" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.528679 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e95b00-60bb-4f8a-933e-47bd1f36c336" containerName="swift-ring-rebalance" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.528919 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e95b00-60bb-4f8a-933e-47bd1f36c336" containerName="swift-ring-rebalance" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.529675 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.532433 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.541138 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.544236 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-bsns4"] Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.722700 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723069 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723240 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723344 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723448 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723532 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.723627 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55d4\" (UniqueName: \"kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826131 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826210 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826230 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826251 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55d4\" (UniqueName: \"kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826305 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826334 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.826425 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.827419 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.828316 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.828519 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.837163 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.844454 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.857581 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:37 crc kubenswrapper[4954]: I1206 10:08:37.900289 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55d4\" (UniqueName: \"kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4\") pod \"swift-ring-rebalance-debug-bsns4\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:38 crc kubenswrapper[4954]: I1206 10:08:38.185086 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:39 crc kubenswrapper[4954]: I1206 10:08:39.151374 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-bsns4"] Dec 06 10:08:39 crc kubenswrapper[4954]: I1206 10:08:39.383621 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bsns4" event={"ID":"037710e0-a7fd-43e5-9b43-bbf872e102e8","Type":"ContainerStarted","Data":"5a01f4b14e087bcfae6b230aefd8f9f3e020247f23bd416efedd445d9a98a348"} Dec 06 10:08:40 crc kubenswrapper[4954]: I1206 10:08:40.396010 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bsns4" event={"ID":"037710e0-a7fd-43e5-9b43-bbf872e102e8","Type":"ContainerStarted","Data":"ae6e69204278cc49d3a4d078fe257fe7f46edd8c3b9c8b24a6545a65ae4c8d0a"} Dec 06 10:08:40 crc kubenswrapper[4954]: I1206 10:08:40.420488 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-bsns4" podStartSLOduration=3.420463877 podStartE2EDuration="3.420463877s" podCreationTimestamp="2025-12-06 10:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:08:40.410765939 +0000 UTC m=+11495.224125338" watchObservedRunningTime="2025-12-06 10:08:40.420463877 +0000 UTC m=+11495.233823266" Dec 06 10:08:49 crc kubenswrapper[4954]: I1206 10:08:49.497532 4954 generic.go:334] "Generic (PLEG): container finished" podID="037710e0-a7fd-43e5-9b43-bbf872e102e8" containerID="ae6e69204278cc49d3a4d078fe257fe7f46edd8c3b9c8b24a6545a65ae4c8d0a" exitCode=0 Dec 06 10:08:49 crc kubenswrapper[4954]: I1206 10:08:49.497684 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-bsns4" event={"ID":"037710e0-a7fd-43e5-9b43-bbf872e102e8","Type":"ContainerDied","Data":"ae6e69204278cc49d3a4d078fe257fe7f46edd8c3b9c8b24a6545a65ae4c8d0a"} Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.291671 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.327545 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-bsns4"] Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.336946 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-bsns4"] Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347330 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347394 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347480 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347519 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347547 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k55d4\" (UniqueName: \"kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347589 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.347610 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf\") pod \"037710e0-a7fd-43e5-9b43-bbf872e102e8\" (UID: \"037710e0-a7fd-43e5-9b43-bbf872e102e8\") " Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.348441 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.349250 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.363838 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4" (OuterVolumeSpecName: "kube-api-access-k55d4") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "kube-api-access-k55d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.376337 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts" (OuterVolumeSpecName: "scripts") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.381745 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.387299 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.395734 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "037710e0-a7fd-43e5-9b43-bbf872e102e8" (UID: "037710e0-a7fd-43e5-9b43-bbf872e102e8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.449928 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.449973 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.449986 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k55d4\" (UniqueName: \"kubernetes.io/projected/037710e0-a7fd-43e5-9b43-bbf872e102e8-kube-api-access-k55d4\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.450000 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/037710e0-a7fd-43e5-9b43-bbf872e102e8-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.450012 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/037710e0-a7fd-43e5-9b43-bbf872e102e8-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.450023 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.450034 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037710e0-a7fd-43e5-9b43-bbf872e102e8-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.527090 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a01f4b14e087bcfae6b230aefd8f9f3e020247f23bd416efedd445d9a98a348" Dec 06 10:08:52 crc kubenswrapper[4954]: I1206 10:08:52.527161 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-bsns4" Dec 06 10:08:53 crc kubenswrapper[4954]: I1206 10:08:53.456480 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037710e0-a7fd-43e5-9b43-bbf872e102e8" path="/var/lib/kubelet/pods/037710e0-a7fd-43e5-9b43-bbf872e102e8/volumes" Dec 06 10:09:32 crc kubenswrapper[4954]: I1206 10:09:32.287510 4954 scope.go:117] "RemoveContainer" containerID="9e8e1d32cc876a2a039eb292b37621c139e72b2ae14a3cbe8a49b57e7ba43ef0" Dec 06 10:09:32 crc kubenswrapper[4954]: I1206 10:09:32.322204 4954 scope.go:117] "RemoveContainer" containerID="f0a29e7dc0aa94c9496ed9e4b5dc6b7674b1f165ddb395e93fce4ef2ad45cf2e" Dec 06 10:09:32 crc kubenswrapper[4954]: I1206 10:09:32.372094 4954 scope.go:117] "RemoveContainer" containerID="8fe8a6e7a10522504d6edfe5b7a5cebdcb9ecbfddb52a69e3a6b7e2fd5977c26" Dec 06 10:09:32 crc kubenswrapper[4954]: I1206 10:09:32.420914 4954 scope.go:117] "RemoveContainer" containerID="db8b27535ca7959c24eefc4bae860da73fd6ebcb73bcccb8d21ceb921562195f" Dec 06 10:10:32 crc kubenswrapper[4954]: I1206 10:10:32.494986 4954 scope.go:117] "RemoveContainer" containerID="f79a54180969298c63100a693c76b6af39709fb2166750bf0dbf68529c47dfd9" Dec 06 10:10:40 crc kubenswrapper[4954]: I1206 10:10:40.101323 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:10:40 crc kubenswrapper[4954]: I1206 10:10:40.102023 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:11:10 crc kubenswrapper[4954]: I1206 10:11:10.101818 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:11:10 crc kubenswrapper[4954]: I1206 10:11:10.102439 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:11:32 crc kubenswrapper[4954]: I1206 10:11:32.588445 4954 scope.go:117] "RemoveContainer" containerID="80a74c25dc861f1846f0f61a69e3f4e06dc809ec4686a2a724e5cac68f3d206b" Dec 06 10:11:32 crc kubenswrapper[4954]: I1206 10:11:32.628727 4954 scope.go:117] "RemoveContainer" containerID="130ea2278f6920a0111a57c3d384d4e86caf46e68392b9dfde42690c1a03c091" Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.100843 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.101349 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.101408 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.102222 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.102278 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f" gracePeriod=600 Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.807310 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f" exitCode=0 Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.807382 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f"} Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.807661 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a"} Dec 06 10:11:40 crc kubenswrapper[4954]: I1206 10:11:40.807766 4954 scope.go:117] "RemoveContainer" containerID="c073981738cc00937b03fc8349fd3eaa107a2442cadc92f047e773b47c5bcfda" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.327391 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-cnp2n"] Dec 06 10:11:59 crc kubenswrapper[4954]: E1206 10:11:59.328611 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037710e0-a7fd-43e5-9b43-bbf872e102e8" containerName="swift-ring-rebalance" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.328626 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="037710e0-a7fd-43e5-9b43-bbf872e102e8" containerName="swift-ring-rebalance" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.328864 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="037710e0-a7fd-43e5-9b43-bbf872e102e8" containerName="swift-ring-rebalance" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.329576 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.333605 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.344358 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.377332 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-cnp2n"] Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.417901 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.417953 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6xt\" (UniqueName: \"kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.417982 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.418060 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.418087 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.418166 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.418198 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522302 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522369 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522408 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6xt\" (UniqueName: \"kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522444 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522633 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522673 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.522804 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.523973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.524973 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.525910 4954 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.529096 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.531982 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.536067 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.549192 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6xt\" (UniqueName: \"kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt\") pod \"swift-ring-rebalance-debug-cnp2n\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:11:59 crc kubenswrapper[4954]: I1206 10:11:59.668186 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:12:00 crc kubenswrapper[4954]: I1206 10:12:00.424355 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-cnp2n"] Dec 06 10:12:00 crc kubenswrapper[4954]: W1206 10:12:00.438075 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c2bfda5_d72d_4195_8e0d_09c55a4b5e73.slice/crio-1e15b22df5d1f990b238fac2d66059d3dadda3111724071b04195c4bb86cca2a WatchSource:0}: Error finding container 1e15b22df5d1f990b238fac2d66059d3dadda3111724071b04195c4bb86cca2a: Status 404 returned error can't find the container with id 1e15b22df5d1f990b238fac2d66059d3dadda3111724071b04195c4bb86cca2a Dec 06 10:12:01 crc kubenswrapper[4954]: I1206 10:12:01.050628 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-cnp2n" event={"ID":"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73","Type":"ContainerStarted","Data":"7b4ee08b1c1fb73bc7e831f032a9a540ca446f2b59e405d66fb2945abb8e17c4"} Dec 06 10:12:01 crc kubenswrapper[4954]: I1206 10:12:01.051143 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-cnp2n" event={"ID":"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73","Type":"ContainerStarted","Data":"1e15b22df5d1f990b238fac2d66059d3dadda3111724071b04195c4bb86cca2a"} Dec 06 10:12:01 crc kubenswrapper[4954]: I1206 10:12:01.078056 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-cnp2n" podStartSLOduration=2.078021594 podStartE2EDuration="2.078021594s" podCreationTimestamp="2025-12-06 10:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 10:12:01.070517304 +0000 UTC m=+11695.883876683" watchObservedRunningTime="2025-12-06 10:12:01.078021594 +0000 UTC m=+11695.891380983" Dec 06 10:12:08 crc kubenswrapper[4954]: I1206 10:12:08.139401 4954 generic.go:334] "Generic (PLEG): container finished" podID="3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" containerID="7b4ee08b1c1fb73bc7e831f032a9a540ca446f2b59e405d66fb2945abb8e17c4" exitCode=0 Dec 06 10:12:08 crc kubenswrapper[4954]: I1206 10:12:08.139502 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-cnp2n" event={"ID":"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73","Type":"ContainerDied","Data":"7b4ee08b1c1fb73bc7e831f032a9a540ca446f2b59e405d66fb2945abb8e17c4"} Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.610805 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.661993 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-cnp2n"] Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.673641 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-cnp2n"] Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749437 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749479 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749670 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749766 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749803 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw6xt\" (UniqueName: \"kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749869 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.749901 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices\") pod \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\" (UID: \"3c2bfda5-d72d-4195-8e0d-09c55a4b5e73\") " Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.750659 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.751090 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.761993 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt" (OuterVolumeSpecName: "kube-api-access-xw6xt") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "kube-api-access-xw6xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.786067 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts" (OuterVolumeSpecName: "scripts") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.799732 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.808164 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.811702 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" (UID: "3c2bfda5-d72d-4195-8e0d-09c55a4b5e73"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851859 4954 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851889 4954 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851902 4954 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851910 4954 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851918 4954 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851926 4954 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:10 crc kubenswrapper[4954]: I1206 10:12:10.851936 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw6xt\" (UniqueName: \"kubernetes.io/projected/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73-kube-api-access-xw6xt\") on node \"crc\" DevicePath \"\"" Dec 06 10:12:11 crc kubenswrapper[4954]: I1206 10:12:11.168640 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e15b22df5d1f990b238fac2d66059d3dadda3111724071b04195c4bb86cca2a" Dec 06 10:12:11 crc kubenswrapper[4954]: I1206 10:12:11.168681 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-cnp2n" Dec 06 10:12:11 crc kubenswrapper[4954]: I1206 10:12:11.453972 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" path="/var/lib/kubelet/pods/3c2bfda5-d72d-4195-8e0d-09c55a4b5e73/volumes" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.844129 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:12:24 crc kubenswrapper[4954]: E1206 10:12:24.845224 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" containerName="swift-ring-rebalance" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.845240 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" containerName="swift-ring-rebalance" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.845496 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bfda5-d72d-4195-8e0d-09c55a4b5e73" containerName="swift-ring-rebalance" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.846280 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.849087 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.849880 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.850093 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.850208 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.868325 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.932511 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.932802 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.932961 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933038 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933114 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933197 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933297 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933382 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6t6\" (UniqueName: \"kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:24 crc kubenswrapper[4954]: I1206 10:12:24.933458 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.034908 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.036073 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.036280 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.036385 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6t6\" (UniqueName: \"kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.036504 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.036681 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.037718 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.037873 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.038216 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.038326 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.038445 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.038613 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.039087 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.039449 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.052464 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.052481 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " 
pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.052497 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.057684 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6t6\" (UniqueName: \"kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.108054 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.165118 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:12:25 crc kubenswrapper[4954]: I1206 10:12:25.891490 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 10:12:26 crc kubenswrapper[4954]: I1206 10:12:26.348442 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86f6275c-9439-4a32-a0b7-467f7df9670f","Type":"ContainerStarted","Data":"9f948789366bfc24f3b17859f6ae0920a0692a8cebc5b71d42e4a487777fdc08"} Dec 06 10:12:32 crc kubenswrapper[4954]: I1206 10:12:32.761837 4954 scope.go:117] "RemoveContainer" containerID="75ee6942ae87684e64cf9fc4273c140a3f27f61a2ea846c15b420a4daf9fd381" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.352398 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.356932 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.378820 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.536075 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.536392 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.536790 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl4c\" (UniqueName: \"kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.639447 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cl4c\" (UniqueName: \"kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.639608 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.639836 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.640729 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.640911 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.675973 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5cl4c\" (UniqueName: \"kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c\") pod \"community-operators-9rp7t\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:48 crc kubenswrapper[4954]: I1206 10:12:48.683250 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:12:49 crc kubenswrapper[4954]: I1206 10:12:49.757331 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:12:50 crc kubenswrapper[4954]: I1206 10:12:50.643149 4954 generic.go:334] "Generic (PLEG): container finished" podID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerID="e2756996125acbf740e934eeb3bb2e6a0a390554c92ad8f6d074ecf43d986a24" exitCode=0 Dec 06 10:12:50 crc kubenswrapper[4954]: I1206 10:12:50.643224 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerDied","Data":"e2756996125acbf740e934eeb3bb2e6a0a390554c92ad8f6d074ecf43d986a24"} Dec 06 10:12:50 crc kubenswrapper[4954]: I1206 10:12:50.644047 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerStarted","Data":"f4a9dbeb8c09598484f576e6eab25ad3b910fef08391bd6aa37c17540c8a9f57"} Dec 06 10:12:50 crc kubenswrapper[4954]: I1206 10:12:50.646463 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:12:52 crc kubenswrapper[4954]: I1206 10:12:52.672909 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerStarted","Data":"21ce468ad5a7e3fa9e2726e5a11fbfc5f51f970101725b236f504cf84639d97c"} Dec 06 10:12:53 crc kubenswrapper[4954]: I1206 10:12:53.686916 4954 generic.go:334] "Generic (PLEG): container finished" podID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerID="21ce468ad5a7e3fa9e2726e5a11fbfc5f51f970101725b236f504cf84639d97c" exitCode=0 Dec 06 10:12:53 crc kubenswrapper[4954]: I1206 10:12:53.686968 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerDied","Data":"21ce468ad5a7e3fa9e2726e5a11fbfc5f51f970101725b236f504cf84639d97c"} Dec 06 10:13:32 crc kubenswrapper[4954]: E1206 10:13:32.480683 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3923531bcda0b0811b2d5053f189beb" Dec 06 10:13:32 crc kubenswrapper[4954]: E1206 10:13:32.481201 4954 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3923531bcda0b0811b2d5053f189beb" Dec 06 10:13:32 crc kubenswrapper[4954]: E1206 10:13:32.483310 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3923531bcda0b0811b2d5053f189beb,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7z6t6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(86f6275c-9439-4a32-a0b7-467f7df9670f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 10:13:32 crc kubenswrapper[4954]: E1206 10:13:32.484602 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="86f6275c-9439-4a32-a0b7-467f7df9670f" Dec 06 10:13:33 crc kubenswrapper[4954]: I1206 10:13:33.307044 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerStarted","Data":"0a310c367000bf570ee485d1a15884d7977b698e45364126e72c3474167eeed5"} Dec 06 10:13:33 crc kubenswrapper[4954]: E1206 10:13:33.309345 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3923531bcda0b0811b2d5053f189beb\\\"\"" pod="openstack/tempest-tests-tempest" podUID="86f6275c-9439-4a32-a0b7-467f7df9670f" Dec 06 10:13:33 crc kubenswrapper[4954]: I1206 10:13:33.339470 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9rp7t" podStartSLOduration=3.189250293 podStartE2EDuration="45.339443657s" podCreationTimestamp="2025-12-06 10:12:48 +0000 UTC" firstStartedPulling="2025-12-06 10:12:50.646192268 +0000 UTC m=+11745.459551657" lastFinishedPulling="2025-12-06 10:13:32.796385632 +0000 UTC m=+11787.609745021" observedRunningTime="2025-12-06 10:13:33.326230925 +0000 UTC m=+11788.139590314" watchObservedRunningTime="2025-12-06 10:13:33.339443657 +0000 UTC m=+11788.152803046" Dec 06 10:13:38 crc kubenswrapper[4954]: I1206 10:13:38.074401 4954 scope.go:117] "RemoveContainer" containerID="2cb2e0aa370ab9f0741ae11d86b39e5b76a3921aeea935079b746baac62f57de" Dec 06 10:13:38 crc kubenswrapper[4954]: I1206 10:13:38.684501 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:38 crc kubenswrapper[4954]: I1206 10:13:38.685113 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:38 crc kubenswrapper[4954]: I1206 10:13:38.739743 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:39 crc kubenswrapper[4954]: I1206 10:13:39.436422 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:39 crc kubenswrapper[4954]: I1206 10:13:39.509046 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:13:40 crc kubenswrapper[4954]: I1206 10:13:40.100993 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:13:40 crc kubenswrapper[4954]: I1206 10:13:40.101048 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:13:41 crc kubenswrapper[4954]: I1206 10:13:41.407328 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9rp7t" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" 
containerName="registry-server" containerID="cri-o://0a310c367000bf570ee485d1a15884d7977b698e45364126e72c3474167eeed5" gracePeriod=2 Dec 06 10:13:42 crc kubenswrapper[4954]: I1206 10:13:42.418932 4954 generic.go:334] "Generic (PLEG): container finished" podID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerID="0a310c367000bf570ee485d1a15884d7977b698e45364126e72c3474167eeed5" exitCode=0 Dec 06 10:13:42 crc kubenswrapper[4954]: I1206 10:13:42.419021 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerDied","Data":"0a310c367000bf570ee485d1a15884d7977b698e45364126e72c3474167eeed5"} Dec 06 10:13:42 crc kubenswrapper[4954]: I1206 10:13:42.957074 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.055127 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content\") pod \"c4d5f19c-c3fd-4287-b020-724b7d178a15\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.055263 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities\") pod \"c4d5f19c-c3fd-4287-b020-724b7d178a15\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.055298 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cl4c\" (UniqueName: \"kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c\") pod \"c4d5f19c-c3fd-4287-b020-724b7d178a15\" (UID: \"c4d5f19c-c3fd-4287-b020-724b7d178a15\") " Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.055889 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities" (OuterVolumeSpecName: "utilities") pod "c4d5f19c-c3fd-4287-b020-724b7d178a15" (UID: "c4d5f19c-c3fd-4287-b020-724b7d178a15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.062240 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c" (OuterVolumeSpecName: "kube-api-access-5cl4c") pod "c4d5f19c-c3fd-4287-b020-724b7d178a15" (UID: "c4d5f19c-c3fd-4287-b020-724b7d178a15"). InnerVolumeSpecName "kube-api-access-5cl4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.122282 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4d5f19c-c3fd-4287-b020-724b7d178a15" (UID: "c4d5f19c-c3fd-4287-b020-724b7d178a15"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.158503 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.158542 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d5f19c-c3fd-4287-b020-724b7d178a15-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.158554 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cl4c\" (UniqueName: \"kubernetes.io/projected/c4d5f19c-c3fd-4287-b020-724b7d178a15-kube-api-access-5cl4c\") on node \"crc\" DevicePath \"\"" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.432985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rp7t" event={"ID":"c4d5f19c-c3fd-4287-b020-724b7d178a15","Type":"ContainerDied","Data":"f4a9dbeb8c09598484f576e6eab25ad3b910fef08391bd6aa37c17540c8a9f57"} Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.433031 4954 scope.go:117] "RemoveContainer" containerID="0a310c367000bf570ee485d1a15884d7977b698e45364126e72c3474167eeed5" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.433112 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9rp7t" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.459102 4954 scope.go:117] "RemoveContainer" containerID="21ce468ad5a7e3fa9e2726e5a11fbfc5f51f970101725b236f504cf84639d97c" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.494555 4954 scope.go:117] "RemoveContainer" containerID="e2756996125acbf740e934eeb3bb2e6a0a390554c92ad8f6d074ecf43d986a24" Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.494768 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:13:43 crc kubenswrapper[4954]: I1206 10:13:43.507288 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9rp7t"] Dec 06 10:13:44 crc kubenswrapper[4954]: I1206 10:13:44.634446 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 10:13:45 crc kubenswrapper[4954]: I1206 10:13:45.460053 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" path="/var/lib/kubelet/pods/c4d5f19c-c3fd-4287-b020-724b7d178a15/volumes" Dec 06 10:13:46 crc kubenswrapper[4954]: I1206 10:13:46.473643 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86f6275c-9439-4a32-a0b7-467f7df9670f","Type":"ContainerStarted","Data":"3292d8886bc231cb5507d8f5dfba26c04d8627aec908a3d6ebf2fc547a27f018"} Dec 06 10:13:46 crc kubenswrapper[4954]: I1206 10:13:46.506302 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.784455441 podStartE2EDuration="1m23.506277016s" podCreationTimestamp="2025-12-06 10:12:23 +0000 UTC" firstStartedPulling="2025-12-06 10:12:25.908635795 +0000 UTC m=+11720.721995184" lastFinishedPulling="2025-12-06 10:13:44.63045737 +0000 UTC m=+11799.443816759" observedRunningTime="2025-12-06 10:13:46.489032966 +0000 UTC m=+11801.302392355" 
watchObservedRunningTime="2025-12-06 10:13:46.506277016 +0000 UTC m=+11801.319636405" Dec 06 10:14:10 crc kubenswrapper[4954]: I1206 10:14:10.101382 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:14:10 crc kubenswrapper[4954]: I1206 10:14:10.101986 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:14:38 crc kubenswrapper[4954]: I1206 10:14:38.257823 4954 scope.go:117] "RemoveContainer" containerID="ed920bb85f510b053b00bae050c4ea96fc2448c010e61a1ab30c164a9abb654b" Dec 06 10:14:38 crc kubenswrapper[4954]: I1206 10:14:38.299891 4954 scope.go:117] "RemoveContainer" containerID="0dacb5f582e1b9282f4e2dd4ad76fc22bd2f53a365726b3deff2c831a9064945" Dec 06 10:14:40 crc kubenswrapper[4954]: I1206 10:14:40.100745 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:14:40 crc kubenswrapper[4954]: I1206 10:14:40.101139 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:14:40 crc kubenswrapper[4954]: I1206 10:14:40.101188 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:14:40 crc kubenswrapper[4954]: I1206 10:14:40.102044 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:14:40 crc kubenswrapper[4954]: I1206 10:14:40.102099 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" gracePeriod=600 Dec 06 10:14:40 crc kubenswrapper[4954]: E1206 10:14:40.227378 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:14:41 crc kubenswrapper[4954]: I1206 10:14:41.043316 4954 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a"} Dec 06 10:14:41 crc kubenswrapper[4954]: I1206 10:14:41.043375 4954 scope.go:117] "RemoveContainer" containerID="247a93ffc994912c409c703902464a29e6833755e47687bdbdff1a3dbad9101f" Dec 06 10:14:41 crc kubenswrapper[4954]: I1206 10:14:41.043246 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" exitCode=0 Dec 06 10:14:41 crc kubenswrapper[4954]: I1206 10:14:41.044333 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:14:41 crc kubenswrapper[4954]: E1206 10:14:41.044781 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:14:55 crc kubenswrapper[4954]: I1206 10:14:55.452322 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:14:55 crc kubenswrapper[4954]: E1206 10:14:55.453249 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.175280 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4"] Dec 06 10:15:00 crc kubenswrapper[4954]: E1206 10:15:00.177512 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="extract-content" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.177647 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="extract-content" Dec 06 10:15:00 crc kubenswrapper[4954]: E1206 10:15:00.177742 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="extract-utilities" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.177801 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="extract-utilities" Dec 06 10:15:00 crc kubenswrapper[4954]: E1206 10:15:00.177861 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.177922 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.178267 4954 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c4d5f19c-c3fd-4287-b020-724b7d178a15" containerName="registry-server" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.179103 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.184500 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.184589 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.191576 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4"] Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.237098 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv55l\" (UniqueName: \"kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.237180 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.237609 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.339937 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv55l\" (UniqueName: \"kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.340008 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.340108 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.341121 
4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.346405 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.359129 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv55l\" (UniqueName: \"kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l\") pod \"collect-profiles-29416935-gpwr4\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:00 crc kubenswrapper[4954]: I1206 10:15:00.503959 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:01 crc kubenswrapper[4954]: I1206 10:15:01.546349 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4"] Dec 06 10:15:01 crc kubenswrapper[4954]: W1206 10:15:01.550684 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc262a1d8_116e_4361_87e9_f38f668f13bf.slice/crio-a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6 WatchSource:0}: Error finding container a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6: Status 404 returned error can't find the container with id a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6 Dec 06 10:15:02 crc kubenswrapper[4954]: I1206 10:15:02.262411 4954 generic.go:334] "Generic (PLEG): container finished" podID="c262a1d8-116e-4361-87e9-f38f668f13bf" containerID="65f80e6223af02ce9b8c931df949710122a67e588dcf71313bc4218c3f16adb7" exitCode=0 Dec 06 10:15:02 crc kubenswrapper[4954]: I1206 10:15:02.262513 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" event={"ID":"c262a1d8-116e-4361-87e9-f38f668f13bf","Type":"ContainerDied","Data":"65f80e6223af02ce9b8c931df949710122a67e588dcf71313bc4218c3f16adb7"} Dec 06 10:15:02 crc kubenswrapper[4954]: I1206 10:15:02.262734 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" event={"ID":"c262a1d8-116e-4361-87e9-f38f668f13bf","Type":"ContainerStarted","Data":"a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6"} Dec 06 10:15:04 crc kubenswrapper[4954]: I1206 10:15:04.942452 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.014668 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume\") pod \"c262a1d8-116e-4361-87e9-f38f668f13bf\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.014814 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv55l\" (UniqueName: \"kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l\") pod \"c262a1d8-116e-4361-87e9-f38f668f13bf\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.015018 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume\") pod \"c262a1d8-116e-4361-87e9-f38f668f13bf\" (UID: \"c262a1d8-116e-4361-87e9-f38f668f13bf\") " Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.015315 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "c262a1d8-116e-4361-87e9-f38f668f13bf" (UID: "c262a1d8-116e-4361-87e9-f38f668f13bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.015764 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c262a1d8-116e-4361-87e9-f38f668f13bf-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.050718 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c262a1d8-116e-4361-87e9-f38f668f13bf" (UID: "c262a1d8-116e-4361-87e9-f38f668f13bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.050801 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l" (OuterVolumeSpecName: "kube-api-access-gv55l") pod "c262a1d8-116e-4361-87e9-f38f668f13bf" (UID: "c262a1d8-116e-4361-87e9-f38f668f13bf"). InnerVolumeSpecName "kube-api-access-gv55l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.119904 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv55l\" (UniqueName: \"kubernetes.io/projected/c262a1d8-116e-4361-87e9-f38f668f13bf-kube-api-access-gv55l\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.119938 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c262a1d8-116e-4361-87e9-f38f668f13bf-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.291614 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" event={"ID":"c262a1d8-116e-4361-87e9-f38f668f13bf","Type":"ContainerDied","Data":"a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6"} Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.291660 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4dafb90975242726eb80e6439f0469f3a0d09f66ca129777ff2fbfdb3cfa4d6" Dec 06 10:15:05 crc kubenswrapper[4954]: I1206 10:15:05.291723 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416935-gpwr4" Dec 06 10:15:06 crc kubenswrapper[4954]: I1206 10:15:06.057528 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb"] Dec 06 10:15:06 crc kubenswrapper[4954]: I1206 10:15:06.068440 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416890-5w8nb"] Dec 06 10:15:07 crc kubenswrapper[4954]: I1206 10:15:07.443101 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:15:07 crc kubenswrapper[4954]: E1206 10:15:07.444393 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:15:07 crc kubenswrapper[4954]: I1206 10:15:07.453733 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e" path="/var/lib/kubelet/pods/1cb1fbbb-35ab-4c0b-ba48-b0b2b9a6d54e/volumes" Dec 06 10:15:19 crc kubenswrapper[4954]: I1206 10:15:19.443854 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:15:19 crc kubenswrapper[4954]: E1206 10:15:19.444706 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:15:32 crc kubenswrapper[4954]: I1206 10:15:32.444328 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 
10:15:32 crc kubenswrapper[4954]: E1206 10:15:32.446469 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:15:38 crc kubenswrapper[4954]: I1206 10:15:38.459303 4954 scope.go:117] "RemoveContainer" containerID="8ba1edb0760f827f659527418cacefb800eeff40f5dbb8ed52da1de212a97b0d" Dec 06 10:15:38 crc kubenswrapper[4954]: I1206 10:15:38.490830 4954 scope.go:117] "RemoveContainer" containerID="ae6e69204278cc49d3a4d078fe257fe7f46edd8c3b9c8b24a6545a65ae4c8d0a" Dec 06 10:15:44 crc kubenswrapper[4954]: I1206 10:15:44.444053 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:15:44 crc kubenswrapper[4954]: E1206 10:15:44.444767 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:15:59 crc kubenswrapper[4954]: I1206 10:15:59.446220 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:15:59 crc kubenswrapper[4954]: E1206 10:15:59.447137 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:16:13 crc kubenswrapper[4954]: I1206 10:16:13.443494 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:16:13 crc kubenswrapper[4954]: E1206 10:16:13.444261 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:16:28 crc kubenswrapper[4954]: I1206 10:16:28.444644 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:16:28 crc kubenswrapper[4954]: E1206 10:16:28.445292 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:16:30 crc 
kubenswrapper[4954]: I1206 10:16:30.114458 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:16:30 crc kubenswrapper[4954]: E1206 10:16:30.115247 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c262a1d8-116e-4361-87e9-f38f668f13bf" containerName="collect-profiles" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.115261 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c262a1d8-116e-4361-87e9-f38f668f13bf" containerName="collect-profiles" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.115517 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c262a1d8-116e-4361-87e9-f38f668f13bf" containerName="collect-profiles" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.117128 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.125047 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.159193 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.159284 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.159425 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvbrs\" (UniqueName: \"kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.261246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.261324 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.261442 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbrs\" (UniqueName: \"kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: 
I1206 10:16:30.261789 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.262119 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.291521 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbrs\" (UniqueName: \"kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs\") pod \"redhat-operators-w5m24\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:30 crc kubenswrapper[4954]: I1206 10:16:30.483152 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:31 crc kubenswrapper[4954]: I1206 10:16:31.429504 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:16:31 crc kubenswrapper[4954]: W1206 10:16:31.435685 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4211f4c_93de_4414_acfc_1ca2afd0ba89.slice/crio-0bd7480316619dc52366e5c1cbadbe3078b798a04ab2284bb9230f8c8b05f5d7 WatchSource:0}: Error finding container 0bd7480316619dc52366e5c1cbadbe3078b798a04ab2284bb9230f8c8b05f5d7: Status 404 returned error can't find the container with id 0bd7480316619dc52366e5c1cbadbe3078b798a04ab2284bb9230f8c8b05f5d7 Dec 06 10:16:32 crc kubenswrapper[4954]: I1206 10:16:32.348734 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerID="dbe8c0cd76787c463fc11599ae51cd072b398dcba69b4119cdcd2905b4938125" exitCode=0 Dec 06 10:16:32 crc kubenswrapper[4954]: I1206 10:16:32.348954 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerDied","Data":"dbe8c0cd76787c463fc11599ae51cd072b398dcba69b4119cdcd2905b4938125"} Dec 06 10:16:32 crc kubenswrapper[4954]: I1206 10:16:32.349070 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerStarted","Data":"0bd7480316619dc52366e5c1cbadbe3078b798a04ab2284bb9230f8c8b05f5d7"} Dec 06 10:16:33 crc kubenswrapper[4954]: I1206 10:16:33.378422 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerStarted","Data":"3a0b3a7125c60b20dbbd31b0f518b41f5860d673a9b50ca4d792332c1988a344"} Dec 06 10:16:37 crc kubenswrapper[4954]: I1206 10:16:37.914364 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerID="3a0b3a7125c60b20dbbd31b0f518b41f5860d673a9b50ca4d792332c1988a344" exitCode=0 Dec 06 10:16:37 crc kubenswrapper[4954]: I1206 10:16:37.914437 4954 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerDied","Data":"3a0b3a7125c60b20dbbd31b0f518b41f5860d673a9b50ca4d792332c1988a344"} Dec 06 10:16:38 crc kubenswrapper[4954]: I1206 10:16:38.926298 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerStarted","Data":"39ced84121cb7f0a018918fa8e7a0b22f258d366fa031ccb608264bffb26d15e"} Dec 06 10:16:40 crc kubenswrapper[4954]: I1206 10:16:40.484685 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:40 crc kubenswrapper[4954]: I1206 10:16:40.485077 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:16:41 crc kubenswrapper[4954]: I1206 10:16:41.545883 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5m24" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" probeResult="failure" output=< Dec 06 10:16:41 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:16:41 crc kubenswrapper[4954]: > Dec 06 10:16:42 crc kubenswrapper[4954]: I1206 10:16:42.444162 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:16:42 crc kubenswrapper[4954]: E1206 10:16:42.444620 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:16:51 crc kubenswrapper[4954]: I1206 10:16:51.531347 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5m24" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" probeResult="failure" output=< Dec 06 10:16:51 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:16:51 crc kubenswrapper[4954]: > Dec 06 10:16:54 crc kubenswrapper[4954]: I1206 10:16:54.443895 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:16:54 crc kubenswrapper[4954]: E1206 10:16:54.444442 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:17:00 crc kubenswrapper[4954]: I1206 10:17:00.538482 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:17:00 crc kubenswrapper[4954]: I1206 10:17:00.557112 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5m24" podStartSLOduration=24.414001361 
podStartE2EDuration="30.557063402s" podCreationTimestamp="2025-12-06 10:16:30 +0000 UTC" firstStartedPulling="2025-12-06 10:16:32.35060082 +0000 UTC m=+11967.163960209" lastFinishedPulling="2025-12-06 10:16:38.493662851 +0000 UTC m=+11973.307022250" observedRunningTime="2025-12-06 10:16:38.976703986 +0000 UTC m=+11973.790063375" watchObservedRunningTime="2025-12-06 10:17:00.557063402 +0000 UTC m=+11995.370422791" Dec 06 10:17:00 crc kubenswrapper[4954]: I1206 10:17:00.608922 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:17:01 crc kubenswrapper[4954]: I1206 10:17:01.315533 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:17:02 crc kubenswrapper[4954]: I1206 10:17:02.172581 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5m24" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" containerID="cri-o://39ced84121cb7f0a018918fa8e7a0b22f258d366fa031ccb608264bffb26d15e" gracePeriod=2 Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.185817 4954 generic.go:334] "Generic (PLEG): container finished" podID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerID="39ced84121cb7f0a018918fa8e7a0b22f258d366fa031ccb608264bffb26d15e" exitCode=0 Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.186075 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerDied","Data":"39ced84121cb7f0a018918fa8e7a0b22f258d366fa031ccb608264bffb26d15e"} Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.887062 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.953640 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content\") pod \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.953761 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvbrs\" (UniqueName: \"kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs\") pod \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.953816 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities\") pod \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\" (UID: \"b4211f4c-93de-4414-acfc-1ca2afd0ba89\") " Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.956470 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities" (OuterVolumeSpecName: "utilities") pod "b4211f4c-93de-4414-acfc-1ca2afd0ba89" (UID: "b4211f4c-93de-4414-acfc-1ca2afd0ba89"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:17:03 crc kubenswrapper[4954]: I1206 10:17:03.965921 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs" (OuterVolumeSpecName: "kube-api-access-wvbrs") pod "b4211f4c-93de-4414-acfc-1ca2afd0ba89" (UID: "b4211f4c-93de-4414-acfc-1ca2afd0ba89"). InnerVolumeSpecName "kube-api-access-wvbrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.057439 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvbrs\" (UniqueName: \"kubernetes.io/projected/b4211f4c-93de-4414-acfc-1ca2afd0ba89-kube-api-access-wvbrs\") on node \"crc\" DevicePath \"\"" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.057477 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.166338 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4211f4c-93de-4414-acfc-1ca2afd0ba89" (UID: "b4211f4c-93de-4414-acfc-1ca2afd0ba89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.235237 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5m24" event={"ID":"b4211f4c-93de-4414-acfc-1ca2afd0ba89","Type":"ContainerDied","Data":"0bd7480316619dc52366e5c1cbadbe3078b798a04ab2284bb9230f8c8b05f5d7"} Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.235303 4954 scope.go:117] "RemoveContainer" containerID="39ced84121cb7f0a018918fa8e7a0b22f258d366fa031ccb608264bffb26d15e" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.235476 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5m24" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.261216 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4211f4c-93de-4414-acfc-1ca2afd0ba89-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.280044 4954 scope.go:117] "RemoveContainer" containerID="3a0b3a7125c60b20dbbd31b0f518b41f5860d673a9b50ca4d792332c1988a344" Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.287777 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.301880 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5m24"] Dec 06 10:17:04 crc kubenswrapper[4954]: I1206 10:17:04.349381 4954 scope.go:117] "RemoveContainer" containerID="dbe8c0cd76787c463fc11599ae51cd072b398dcba69b4119cdcd2905b4938125" Dec 06 10:17:05 crc kubenswrapper[4954]: I1206 10:17:05.452431 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:17:05 crc kubenswrapper[4954]: E1206 10:17:05.452958 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:17:05 crc kubenswrapper[4954]: I1206 10:17:05.466612 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" path="/var/lib/kubelet/pods/b4211f4c-93de-4414-acfc-1ca2afd0ba89/volumes" Dec 06 10:17:17 crc kubenswrapper[4954]: I1206 10:17:17.443961 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:17:17 crc kubenswrapper[4954]: E1206 10:17:17.444608 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:17:32 crc kubenswrapper[4954]: I1206 10:17:32.443547 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:17:32 crc kubenswrapper[4954]: E1206 10:17:32.444390 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:17:46 crc kubenswrapper[4954]: I1206 10:17:46.443627 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:17:46 crc kubenswrapper[4954]: E1206 10:17:46.444430 
4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:18:01 crc kubenswrapper[4954]: I1206 10:18:01.443593 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:18:01 crc kubenswrapper[4954]: E1206 10:18:01.444639 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.444018 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:18:12 crc kubenswrapper[4954]: E1206 10:18:12.444705 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.758989 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:12 crc kubenswrapper[4954]: E1206 10:18:12.759418 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="extract-content" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.759430 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="extract-content" Dec 06 10:18:12 crc kubenswrapper[4954]: E1206 10:18:12.759462 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="extract-utilities" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.759468 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="extract-utilities" Dec 06 10:18:12 crc kubenswrapper[4954]: E1206 10:18:12.759477 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.759483 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.759735 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4211f4c-93de-4414-acfc-1ca2afd0ba89" containerName="registry-server" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.761298 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.771864 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.881751 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7r59\" (UniqueName: \"kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.881820 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.881958 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.983789 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.983871 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7r59\" (UniqueName: \"kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.983909 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.984423 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:12 crc kubenswrapper[4954]: I1206 10:18:12.984435 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:13 crc kubenswrapper[4954]: I1206 10:18:13.017520 4954 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r7r59\" (UniqueName: \"kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59\") pod \"redhat-marketplace-qf6qx\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:13 crc kubenswrapper[4954]: I1206 10:18:13.077935 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:14 crc kubenswrapper[4954]: I1206 10:18:14.147241 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:15 crc kubenswrapper[4954]: I1206 10:18:15.020303 4954 generic.go:334] "Generic (PLEG): container finished" podID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerID="7d6899dd54156bd386935803b04e4d8a448c75b53e4692d69bb27a34b625e473" exitCode=0 Dec 06 10:18:15 crc kubenswrapper[4954]: I1206 10:18:15.020848 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerDied","Data":"7d6899dd54156bd386935803b04e4d8a448c75b53e4692d69bb27a34b625e473"} Dec 06 10:18:15 crc kubenswrapper[4954]: I1206 10:18:15.021889 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerStarted","Data":"a308b41295954a3fa11333f28b39ea676d1274c322e4e6f32c3f7a199f6a6242"} Dec 06 10:18:15 crc kubenswrapper[4954]: I1206 10:18:15.023421 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:18:17 crc kubenswrapper[4954]: I1206 10:18:17.044003 4954 generic.go:334] "Generic (PLEG): container finished" podID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerID="a75b1a1a5f9b8bd42584e0e2613f42866052eec12563286816e58f0886749390" exitCode=0 Dec 06 10:18:17 crc kubenswrapper[4954]: I1206 10:18:17.044102 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerDied","Data":"a75b1a1a5f9b8bd42584e0e2613f42866052eec12563286816e58f0886749390"} Dec 06 10:18:18 crc kubenswrapper[4954]: I1206 10:18:18.058084 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerStarted","Data":"a65aaef5d62f70236b68b4be0f56cdbd2482e9e09a6e8903787fe2c0972567fc"} Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.079150 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.079707 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.133416 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.157653 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qf6qx" podStartSLOduration=8.466084798 podStartE2EDuration="11.157633582s" podCreationTimestamp="2025-12-06 10:18:12 +0000 UTC" firstStartedPulling="2025-12-06 10:18:15.022911792 +0000 UTC 
m=+12069.836271181" lastFinishedPulling="2025-12-06 10:18:17.714460576 +0000 UTC m=+12072.527819965" observedRunningTime="2025-12-06 10:18:18.087084761 +0000 UTC m=+12072.900444160" watchObservedRunningTime="2025-12-06 10:18:23.157633582 +0000 UTC m=+12077.970992971" Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.207466 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.374294 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:23 crc kubenswrapper[4954]: I1206 10:18:23.444291 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:18:23 crc kubenswrapper[4954]: E1206 10:18:23.444546 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:18:25 crc kubenswrapper[4954]: I1206 10:18:25.142805 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qf6qx" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="registry-server" containerID="cri-o://a65aaef5d62f70236b68b4be0f56cdbd2482e9e09a6e8903787fe2c0972567fc" gracePeriod=2 Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.153611 4954 generic.go:334] "Generic (PLEG): container finished" podID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerID="a65aaef5d62f70236b68b4be0f56cdbd2482e9e09a6e8903787fe2c0972567fc" exitCode=0 Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.153787 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerDied","Data":"a65aaef5d62f70236b68b4be0f56cdbd2482e9e09a6e8903787fe2c0972567fc"} Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.462937 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.477155 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities\") pod \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.477959 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7r59\" (UniqueName: \"kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59\") pod \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.478185 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content\") pod \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\" (UID: \"702b5f4a-0f6e-411d-8920-90ed2a0e68d3\") " Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.478210 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities" (OuterVolumeSpecName: "utilities") pod "702b5f4a-0f6e-411d-8920-90ed2a0e68d3" (UID: "702b5f4a-0f6e-411d-8920-90ed2a0e68d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.478890 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.485496 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59" (OuterVolumeSpecName: "kube-api-access-r7r59") pod "702b5f4a-0f6e-411d-8920-90ed2a0e68d3" (UID: "702b5f4a-0f6e-411d-8920-90ed2a0e68d3"). InnerVolumeSpecName "kube-api-access-r7r59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.508880 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "702b5f4a-0f6e-411d-8920-90ed2a0e68d3" (UID: "702b5f4a-0f6e-411d-8920-90ed2a0e68d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.582275 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:26 crc kubenswrapper[4954]: I1206 10:18:26.582668 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7r59\" (UniqueName: \"kubernetes.io/projected/702b5f4a-0f6e-411d-8920-90ed2a0e68d3-kube-api-access-r7r59\") on node \"crc\" DevicePath \"\"" Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.166211 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qf6qx" event={"ID":"702b5f4a-0f6e-411d-8920-90ed2a0e68d3","Type":"ContainerDied","Data":"a308b41295954a3fa11333f28b39ea676d1274c322e4e6f32c3f7a199f6a6242"} Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.166244 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qf6qx" Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.166265 4954 scope.go:117] "RemoveContainer" containerID="a65aaef5d62f70236b68b4be0f56cdbd2482e9e09a6e8903787fe2c0972567fc" Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.194135 4954 scope.go:117] "RemoveContainer" containerID="a75b1a1a5f9b8bd42584e0e2613f42866052eec12563286816e58f0886749390" Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.209713 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.228615 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qf6qx"] Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.251090 4954 scope.go:117] "RemoveContainer" containerID="7d6899dd54156bd386935803b04e4d8a448c75b53e4692d69bb27a34b625e473" Dec 06 10:18:27 crc kubenswrapper[4954]: I1206 10:18:27.454423 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" path="/var/lib/kubelet/pods/702b5f4a-0f6e-411d-8920-90ed2a0e68d3/volumes" Dec 06 10:18:36 crc kubenswrapper[4954]: I1206 10:18:36.444253 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:18:36 crc kubenswrapper[4954]: E1206 10:18:36.445065 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:18:38 crc kubenswrapper[4954]: I1206 10:18:38.722764 4954 scope.go:117] "RemoveContainer" containerID="7b4ee08b1c1fb73bc7e831f032a9a540ca446f2b59e405d66fb2945abb8e17c4" Dec 06 10:18:49 crc kubenswrapper[4954]: I1206 10:18:49.443120 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:18:49 crc kubenswrapper[4954]: E1206 10:18:49.444826 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:19:01 crc kubenswrapper[4954]: I1206 10:19:01.444374 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:19:01 crc kubenswrapper[4954]: E1206 10:19:01.445145 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:19:14 crc kubenswrapper[4954]: I1206 10:19:14.443975 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:19:14 crc kubenswrapper[4954]: E1206 10:19:14.444940 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:19:29 crc kubenswrapper[4954]: I1206 10:19:29.443166 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:19:29 crc kubenswrapper[4954]: E1206 10:19:29.444019 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:19:41 crc kubenswrapper[4954]: I1206 10:19:41.444234 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:19:41 crc kubenswrapper[4954]: I1206 10:19:41.933536 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c"} Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.187393 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:21:50 crc kubenswrapper[4954]: E1206 10:21:50.189209 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="extract-utilities" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.189304 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="extract-utilities" Dec 06 10:21:50 crc kubenswrapper[4954]: E1206 10:21:50.189338 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" 
containerName="extract-content" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.189346 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="extract-content" Dec 06 10:21:50 crc kubenswrapper[4954]: E1206 10:21:50.189374 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="registry-server" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.189379 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="registry-server" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.193156 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="702b5f4a-0f6e-411d-8920-90ed2a0e68d3" containerName="registry-server" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.196148 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.253451 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxfd\" (UniqueName: \"kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.253609 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.253709 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.283376 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.355730 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxfd\" (UniqueName: \"kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.355901 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.356054 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content\") pod \"certified-operators-j4krp\" (UID: 
\"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.357147 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.357165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.380022 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxfd\" (UniqueName: \"kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd\") pod \"certified-operators-j4krp\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:50 crc kubenswrapper[4954]: I1206 10:21:50.520855 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:21:51 crc kubenswrapper[4954]: I1206 10:21:51.742231 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:21:52 crc kubenswrapper[4954]: I1206 10:21:52.390149 4954 generic.go:334] "Generic (PLEG): container finished" podID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerID="cc987b9ede5ed31785526a76ef8347df2109d1d168dcab6a1acc5c23dda6d3f4" exitCode=0 Dec 06 10:21:52 crc kubenswrapper[4954]: I1206 10:21:52.390404 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerDied","Data":"cc987b9ede5ed31785526a76ef8347df2109d1d168dcab6a1acc5c23dda6d3f4"} Dec 06 10:21:52 crc kubenswrapper[4954]: I1206 10:21:52.390430 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerStarted","Data":"cf260ec2162f73db0050218cb44cb8b4143bed4608e5b292b4b6c13339e5d11b"} Dec 06 10:21:53 crc kubenswrapper[4954]: I1206 10:21:53.404162 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerStarted","Data":"7385833dcb3df8029663bd48276715af3216998d7c6a06f3a6bf0f7a66fd8156"} Dec 06 10:21:55 crc kubenswrapper[4954]: I1206 10:21:55.430690 4954 generic.go:334] "Generic (PLEG): container finished" podID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerID="7385833dcb3df8029663bd48276715af3216998d7c6a06f3a6bf0f7a66fd8156" exitCode=0 Dec 06 10:21:55 crc kubenswrapper[4954]: I1206 10:21:55.430743 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerDied","Data":"7385833dcb3df8029663bd48276715af3216998d7c6a06f3a6bf0f7a66fd8156"} Dec 06 10:21:56 crc kubenswrapper[4954]: I1206 10:21:56.442322 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerStarted","Data":"4acc2997f734fc62da52e9d203f9b605d874cf33e77d411b25e32ffc1c7235f3"} Dec 06 10:21:56 crc kubenswrapper[4954]: I1206 10:21:56.460117 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j4krp" podStartSLOduration=2.869248365 podStartE2EDuration="6.460087902s" podCreationTimestamp="2025-12-06 10:21:50 +0000 UTC" firstStartedPulling="2025-12-06 10:21:52.392493205 +0000 UTC m=+12287.205852594" lastFinishedPulling="2025-12-06 10:21:55.983332742 +0000 UTC m=+12290.796692131" observedRunningTime="2025-12-06 10:21:56.4566612 +0000 UTC m=+12291.270020589" watchObservedRunningTime="2025-12-06 10:21:56.460087902 +0000 UTC m=+12291.273447281" Dec 06 10:22:00 crc kubenswrapper[4954]: I1206 10:22:00.523335 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:00 crc kubenswrapper[4954]: I1206 10:22:00.523895 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:01 crc kubenswrapper[4954]: I1206 10:22:01.635190 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j4krp" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="registry-server" probeResult="failure" output=< Dec 06 10:22:01 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:22:01 crc kubenswrapper[4954]: > Dec 06 10:22:10 crc kubenswrapper[4954]: I1206 10:22:10.101727 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:22:10 crc kubenswrapper[4954]: I1206 10:22:10.104815 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:22:10 crc kubenswrapper[4954]: I1206 10:22:10.582247 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:10 crc kubenswrapper[4954]: I1206 10:22:10.668719 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:12 crc kubenswrapper[4954]: I1206 10:22:12.717400 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:22:12 crc kubenswrapper[4954]: I1206 10:22:12.720292 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j4krp" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="registry-server" containerID="cri-o://4acc2997f734fc62da52e9d203f9b605d874cf33e77d411b25e32ffc1c7235f3" gracePeriod=2 Dec 06 10:22:13 crc kubenswrapper[4954]: I1206 10:22:13.682791 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" 
event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerDied","Data":"4acc2997f734fc62da52e9d203f9b605d874cf33e77d411b25e32ffc1c7235f3"} Dec 06 10:22:13 crc kubenswrapper[4954]: I1206 10:22:13.682838 4954 generic.go:334] "Generic (PLEG): container finished" podID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerID="4acc2997f734fc62da52e9d203f9b605d874cf33e77d411b25e32ffc1c7235f3" exitCode=0 Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.697959 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4krp" event={"ID":"bbcd83f0-4c98-4200-9748-b2c8868cfeb6","Type":"ContainerDied","Data":"cf260ec2162f73db0050218cb44cb8b4143bed4608e5b292b4b6c13339e5d11b"} Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.699132 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf260ec2162f73db0050218cb44cb8b4143bed4608e5b292b4b6c13339e5d11b" Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.783217 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.885969 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content\") pod \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.886285 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbxfd\" (UniqueName: \"kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd\") pod \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.886382 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities\") pod \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\" (UID: \"bbcd83f0-4c98-4200-9748-b2c8868cfeb6\") " Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.891691 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities" (OuterVolumeSpecName: "utilities") pod "bbcd83f0-4c98-4200-9748-b2c8868cfeb6" (UID: "bbcd83f0-4c98-4200-9748-b2c8868cfeb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.905992 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.945929 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd" (OuterVolumeSpecName: "kube-api-access-dbxfd") pod "bbcd83f0-4c98-4200-9748-b2c8868cfeb6" (UID: "bbcd83f0-4c98-4200-9748-b2c8868cfeb6"). InnerVolumeSpecName "kube-api-access-dbxfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:22:14 crc kubenswrapper[4954]: I1206 10:22:14.968100 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbcd83f0-4c98-4200-9748-b2c8868cfeb6" (UID: "bbcd83f0-4c98-4200-9748-b2c8868cfeb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:22:15 crc kubenswrapper[4954]: I1206 10:22:15.008310 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:22:15 crc kubenswrapper[4954]: I1206 10:22:15.008354 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbxfd\" (UniqueName: \"kubernetes.io/projected/bbcd83f0-4c98-4200-9748-b2c8868cfeb6-kube-api-access-dbxfd\") on node \"crc\" DevicePath \"\"" Dec 06 10:22:15 crc kubenswrapper[4954]: I1206 10:22:15.706266 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4krp" Dec 06 10:22:15 crc kubenswrapper[4954]: I1206 10:22:15.735168 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:22:15 crc kubenswrapper[4954]: I1206 10:22:15.750072 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j4krp"] Dec 06 10:22:17 crc kubenswrapper[4954]: I1206 10:22:17.454459 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" path="/var/lib/kubelet/pods/bbcd83f0-4c98-4200-9748-b2c8868cfeb6/volumes" Dec 06 10:22:37 crc kubenswrapper[4954]: I1206 10:22:37.021384 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-p95p2" podUID="bda5decc-cd20-487a-b0b5-01058cac828c" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:22:40 crc kubenswrapper[4954]: I1206 10:22:40.101388 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:22:40 crc kubenswrapper[4954]: I1206 10:22:40.102341 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.101018 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.102777 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.104698 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.106196 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.106782 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c" gracePeriod=600 Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.608163 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c" exitCode=0 Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.608243 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c"} Dec 06 10:23:10 crc kubenswrapper[4954]: I1206 10:23:10.609509 4954 scope.go:117] "RemoveContainer" containerID="b0be7de17efa3b31122306b105c29bb76d4d9b6e8a75dce86e87d6192958bd4a" Dec 06 10:23:11 crc kubenswrapper[4954]: I1206 10:23:11.620941 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"} Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.241602 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jvvtc"] Dec 06 10:24:08 crc kubenswrapper[4954]: E1206 10:24:08.243428 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="registry-server" Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.243444 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="registry-server" Dec 06 10:24:08 crc kubenswrapper[4954]: E1206 10:24:08.243725 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="extract-utilities" Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.243734 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="extract-utilities" Dec 06 10:24:08 crc kubenswrapper[4954]: E1206 10:24:08.243748 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="extract-content" Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.243754 4954 
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.243754 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="extract-content"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.244413 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbcd83f0-4c98-4200-9748-b2c8868cfeb6" containerName="registry-server"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.248886 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.317365 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6nc\" (UniqueName: \"kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.317850 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.318236 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.420511 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.420593 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6nc\" (UniqueName: \"kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.420720 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.422664 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.422748 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.439800 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvvtc"]
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.452556 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6nc\" (UniqueName: \"kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc\") pod \"community-operators-jvvtc\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") " pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:08 crc kubenswrapper[4954]: I1206 10:24:08.579768 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:09 crc kubenswrapper[4954]: I1206 10:24:09.852548 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvvtc"]
Dec 06 10:24:10 crc kubenswrapper[4954]: I1206 10:24:10.858056 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerDied","Data":"6e1115244e24bfcf4dd5961e7a528bf3d40f26f7dc4be2e6407e4ac1c0d1bcf0"}
Dec 06 10:24:10 crc kubenswrapper[4954]: I1206 10:24:10.858140 4954 generic.go:334] "Generic (PLEG): container finished" podID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerID="6e1115244e24bfcf4dd5961e7a528bf3d40f26f7dc4be2e6407e4ac1c0d1bcf0" exitCode=0
Dec 06 10:24:10 crc kubenswrapper[4954]: I1206 10:24:10.859675 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerStarted","Data":"ec66d1e6f615ef2d44d73672380f07ed25d4bdc38c3f079f644069d605310ecd"}
Dec 06 10:24:10 crc kubenswrapper[4954]: I1206 10:24:10.869460 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 06 10:24:11 crc kubenswrapper[4954]: I1206 10:24:11.883169 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerStarted","Data":"2093317de5b82d043b7de4f469d8789d15622b8e932eadb545124d0588bad307"}
Dec 06 10:24:13 crc kubenswrapper[4954]: E1206 10:24:13.018442 4954 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633cc8c2_e974_4bed_bf88_21d2c68b83cd.slice/crio-2093317de5b82d043b7de4f469d8789d15622b8e932eadb545124d0588bad307.scope\": RecentStats: unable to find data in memory cache]"
Dec 06 10:24:13 crc kubenswrapper[4954]: I1206 10:24:13.922064 4954 generic.go:334] "Generic (PLEG): container finished" podID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerID="2093317de5b82d043b7de4f469d8789d15622b8e932eadb545124d0588bad307" exitCode=0
Dec 06 10:24:13 crc kubenswrapper[4954]: I1206 10:24:13.922164 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerDied","Data":"2093317de5b82d043b7de4f469d8789d15622b8e932eadb545124d0588bad307"}
Dec 06 10:24:14 crc kubenswrapper[4954]: I1206 10:24:14.937507 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerStarted","Data":"7dd7478a76a9090a0606b86fe383daff8a8d1c622c4ff9bb52905564fbcffb15"}
Dec 06 10:24:14 crc kubenswrapper[4954]: I1206 10:24:14.965684 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jvvtc" podStartSLOduration=3.504280074 podStartE2EDuration="6.964848061s" podCreationTimestamp="2025-12-06 10:24:08 +0000 UTC" firstStartedPulling="2025-12-06 10:24:10.867669557 +0000 UTC m=+12425.681028946" lastFinishedPulling="2025-12-06 10:24:14.328237544 +0000 UTC m=+12429.141596933" observedRunningTime="2025-12-06 10:24:14.953047486 +0000 UTC m=+12429.766406896" watchObservedRunningTime="2025-12-06 10:24:14.964848061 +0000 UTC m=+12429.778207450"
Dec 06 10:24:18 crc kubenswrapper[4954]: I1206 10:24:18.581268 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:18 crc kubenswrapper[4954]: I1206 10:24:18.581892 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:19 crc kubenswrapper[4954]: I1206 10:24:19.656528 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jvvtc" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:24:19 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:24:19 crc kubenswrapper[4954]: >
Dec 06 10:24:28 crc kubenswrapper[4954]: I1206 10:24:28.634151 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:28 crc kubenswrapper[4954]: I1206 10:24:28.704042 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:28 crc kubenswrapper[4954]: I1206 10:24:28.830248 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvvtc"]
Dec 06 10:24:30 crc kubenswrapper[4954]: I1206 10:24:30.090279 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jvvtc" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="registry-server" containerID="cri-o://7dd7478a76a9090a0606b86fe383daff8a8d1c622c4ff9bb52905564fbcffb15" gracePeriod=2
Dec 06 10:24:31 crc kubenswrapper[4954]: I1206 10:24:31.122044 4954 generic.go:334] "Generic (PLEG): container finished" podID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerID="7dd7478a76a9090a0606b86fe383daff8a8d1c622c4ff9bb52905564fbcffb15" exitCode=0
Dec 06 10:24:31 crc kubenswrapper[4954]: I1206 10:24:31.122410 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerDied","Data":"7dd7478a76a9090a0606b86fe383daff8a8d1c622c4ff9bb52905564fbcffb15"}
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.051907 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.137208 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content\") pod \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") "
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.137723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs6nc\" (UniqueName: \"kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc\") pod \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") "
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.137979 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities\") pod \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\" (UID: \"633cc8c2-e974-4bed-bf88-21d2c68b83cd\") "
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.139703 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities" (OuterVolumeSpecName: "utilities") pod "633cc8c2-e974-4bed-bf88-21d2c68b83cd" (UID: "633cc8c2-e974-4bed-bf88-21d2c68b83cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.146463 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvvtc" event={"ID":"633cc8c2-e974-4bed-bf88-21d2c68b83cd","Type":"ContainerDied","Data":"ec66d1e6f615ef2d44d73672380f07ed25d4bdc38c3f079f644069d605310ecd"}
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.146660 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvvtc"
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.147063 4954 scope.go:117] "RemoveContainer" containerID="7dd7478a76a9090a0606b86fe383daff8a8d1c622c4ff9bb52905564fbcffb15"
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.166662 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc" (OuterVolumeSpecName: "kube-api-access-hs6nc") pod "633cc8c2-e974-4bed-bf88-21d2c68b83cd" (UID: "633cc8c2-e974-4bed-bf88-21d2c68b83cd"). InnerVolumeSpecName "kube-api-access-hs6nc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.221793 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "633cc8c2-e974-4bed-bf88-21d2c68b83cd" (UID: "633cc8c2-e974-4bed-bf88-21d2c68b83cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.241951 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.241995 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs6nc\" (UniqueName: \"kubernetes.io/projected/633cc8c2-e974-4bed-bf88-21d2c68b83cd-kube-api-access-hs6nc\") on node \"crc\" DevicePath \"\""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.242008 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cc8c2-e974-4bed-bf88-21d2c68b83cd-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.256924 4954 scope.go:117] "RemoveContainer" containerID="2093317de5b82d043b7de4f469d8789d15622b8e932eadb545124d0588bad307"
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.297588 4954 scope.go:117] "RemoveContainer" containerID="6e1115244e24bfcf4dd5961e7a528bf3d40f26f7dc4be2e6407e4ac1c0d1bcf0"
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.487354 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvvtc"]
Dec 06 10:24:32 crc kubenswrapper[4954]: I1206 10:24:32.497657 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jvvtc"]
Dec 06 10:24:33 crc kubenswrapper[4954]: I1206 10:24:33.454774 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" path="/var/lib/kubelet/pods/633cc8c2-e974-4bed-bf88-21d2c68b83cd/volumes"
Dec 06 10:24:39 crc kubenswrapper[4954]: I1206 10:24:39.768302 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="237adbf8-c3e0-427c-ac06-656c110de87b" containerName="galera" probeResult="failure" output="command timed out"
Dec 06 10:24:39 crc kubenswrapper[4954]: I1206 10:24:39.769820 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="237adbf8-c3e0-427c-ac06-656c110de87b" containerName="galera" probeResult="failure" output="command timed out"
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.608049 4954 patch_prober.go:28] interesting pod/controller-manager-6986b9b5c-phln9 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.609040 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" podUID="dc530b04-9a42-4424-86d6-4920002e0f07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.609179 4954 patch_prober.go:28] interesting pod/controller-manager-6986b9b5c-phln9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.609213 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6986b9b5c-phln9" podUID="dc530b04-9a42-4424-86d6-4920002e0f07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.768875 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="56016706-35a9-41e9-afef-3555f10a4e94" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 06 10:24:40 crc kubenswrapper[4954]: I1206 10:24:40.769615 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="56016706-35a9-41e9-afef-3555f10a4e94" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 06 10:25:04 crc kubenswrapper[4954]: I1206 10:25:04.127799 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kkcv7" podUID="0e05eb35-0ec5-4760-8c17-fa88a565b38c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.75:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:25:04 crc kubenswrapper[4954]: I1206 10:25:04.205956 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hp8pr" podUID="5c0b16d3-8d9a-44c8-882b-8a90fd89379d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:25:04 crc kubenswrapper[4954]: I1206 10:25:04.332934 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-pqzdl" podUID="a47971dc-993b-47f7-b65b-4348dfd56866" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:25:04 crc kubenswrapper[4954]: I1206 10:25:04.375163 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jsxl2" podUID="62e28422-a646-45cc-9a4b-f3de5e2fc463" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.79:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:25:04 crc kubenswrapper[4954]: I1206 10:25:04.458885 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l8h49" podUID="49a30376-5e1f-4065-a8cd-b728b7413a07" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.76:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:25:10 crc kubenswrapper[4954]: I1206 10:25:10.100814 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:25:10 crc kubenswrapper[4954]: I1206 10:25:10.101768 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 10:25:40 crc kubenswrapper[4954]: I1206 10:25:40.101448 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:25:40 crc kubenswrapper[4954]: I1206 10:25:40.101977 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.100733 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.102386 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.102948 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.104476 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.104857 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" gracePeriod=600
Dec 06 10:26:10 crc kubenswrapper[4954]: E1206 10:26:10.261446 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.486944 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" exitCode=0
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.487004 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"}
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.488601 4954 scope.go:117] "RemoveContainer" containerID="456932849a645b1fa6ce8e4313073ddb71cfbc8bbdd73a57e4801eb20ec2757c"
Dec 06 10:26:10 crc kubenswrapper[4954]: I1206 10:26:10.488859 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:26:10 crc kubenswrapper[4954]: E1206 10:26:10.489594 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:26:22 crc kubenswrapper[4954]: I1206 10:26:22.443748 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:26:22 crc kubenswrapper[4954]: E1206 10:26:22.444653 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:26:33 crc kubenswrapper[4954]: I1206 10:26:33.444958 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:26:33 crc kubenswrapper[4954]: E1206 10:26:33.445677 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:26:45 crc kubenswrapper[4954]: I1206 10:26:45.447048 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:26:45 crc kubenswrapper[4954]: E1206 10:26:45.448308 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:26:59 crc kubenswrapper[4954]: I1206 10:26:59.461590 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
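[Annotation] The repeated "CrashLoopBackOff: back-off 5m0s" errors above are the kubelet refusing to restart the container until its backoff timer expires; the periodic "RemoveContainer"/"Error syncing pod" pairs are just sync-loop retries hitting that timer. A sketch of the doubling backoff follows; the 10s base and 5m cap are the upstream kubelet defaults, stated here as assumptions rather than read from this log (the log only confirms the 5m cap).

package main

import (
	"fmt"
	"time"
)

// restartDelay doubles the wait per restart from an assumed 10s base,
// saturating at the 5m cap that appears in the log as "back-off 5m0s".
func restartDelay(restarts int) time.Duration {
	delay := 10 * time.Second
	for i := 0; i < restarts; i++ {
		delay *= 2
		if delay >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return delay
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, restartDelay(r))
	}
	// From the fifth restart on, the delay stays pinned at 5m0s and the
	// kubelet keeps logging "Error syncing pod, skipping" until it expires.
}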
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.613296 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"] Dec 06 10:27:01 crc kubenswrapper[4954]: E1206 10:27:01.626200 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="extract-content" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.626245 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="extract-content" Dec 06 10:27:01 crc kubenswrapper[4954]: E1206 10:27:01.626385 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="extract-utilities" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.626408 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="extract-utilities" Dec 06 10:27:01 crc kubenswrapper[4954]: E1206 10:27:01.626446 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="registry-server" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.626455 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="registry-server" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.627087 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="633cc8c2-e974-4bed-bf88-21d2c68b83cd" containerName="registry-server" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.632285 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.712128 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"] Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.819623 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.819710 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zr7\" (UniqueName: \"kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.820283 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.922515 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.923253 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.923303 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zr7\" (UniqueName: \"kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.923471 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.923932 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.951203 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-42zr7\" (UniqueName: \"kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7\") pod \"redhat-operators-ph8f9\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") " pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:01 crc kubenswrapper[4954]: I1206 10:27:01.960102 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:03 crc kubenswrapper[4954]: I1206 10:27:03.461480 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"] Dec 06 10:27:04 crc kubenswrapper[4954]: I1206 10:27:04.049839 4954 generic.go:334] "Generic (PLEG): container finished" podID="8784656c-ced3-45fc-98db-857dd2b641ec" containerID="c243eac00c843a76b5a3533aca446a2da8fda816c087bb136ab28f1ef3d45187" exitCode=0 Dec 06 10:27:04 crc kubenswrapper[4954]: I1206 10:27:04.050032 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerDied","Data":"c243eac00c843a76b5a3533aca446a2da8fda816c087bb136ab28f1ef3d45187"} Dec 06 10:27:04 crc kubenswrapper[4954]: I1206 10:27:04.050184 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerStarted","Data":"0950cd5bb90a139a7f8d396c1d91517e8e93b8701dfcc3ab5454dc9d99f00cc6"} Dec 06 10:27:06 crc kubenswrapper[4954]: I1206 10:27:06.075736 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerStarted","Data":"710e53f223b583f9e6c9cde9802fcc8a0ae2926d97769f739e09b211f4a06d25"} Dec 06 10:27:11 crc kubenswrapper[4954]: I1206 10:27:11.020717 4954 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6wbv2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:11 crc kubenswrapper[4954]: I1206 10:27:11.023067 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" podUID="76a9970f-0017-4289-b4d5-804bbf7b0e9d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:11 crc kubenswrapper[4954]: I1206 10:27:11.114214 4954 generic.go:334] "Generic (PLEG): container finished" podID="8784656c-ced3-45fc-98db-857dd2b641ec" containerID="710e53f223b583f9e6c9cde9802fcc8a0ae2926d97769f739e09b211f4a06d25" exitCode=0 Dec 06 10:27:11 crc kubenswrapper[4954]: I1206 10:27:11.114283 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerDied","Data":"710e53f223b583f9e6c9cde9802fcc8a0ae2926d97769f739e09b211f4a06d25"} Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.190680 4954 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-qlrf9 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure 
output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.190748 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-qlrf9" podUID="21436169-5d86-482f-8277-dd88780f2b68" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.260101 4954 patch_prober.go:28] interesting pod/console-operator-58897d9998-7p9tp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.260169 4954 patch_prober.go:28] interesting pod/console-operator-58897d9998-7p9tp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.260178 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" podUID="9078eab8-cd16-404e-a8e6-e02c60ddfe16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.260228 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" podUID="9078eab8-cd16-404e-a8e6-e02c60ddfe16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:12.443656 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:27:13 crc kubenswrapper[4954]: E1206 10:27:12.443986 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.194876 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.194944 4954 patch_prober.go:28] interesting pod/downloads-7954f5f757-56lst container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.194967 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.195015 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56lst" podUID="891188e6-3c26-44de-84b2-6585f0d5e7dd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.553803 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-h2z85" podUID="e3a2a692-b8fa-4f39-b507-9dd36ab9593e" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.72:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.745796 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.745866 4954 patch_prober.go:28] interesting pod/router-default-5444994796-k5tts container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.745874 4954 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.745945 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-k5tts" podUID="ed8e0159-4aa0-4e24-982d-b8e43b561192" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.831215 4954 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.026859352s: [/var/lib/containers/storage/overlay/d261f8237150bf353d530364348ac8c8bbf414344f72d88c3ace6e80c408af85/diff /var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/rsync/0.log]; will not log again for this container unless duration exceeds 2s Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.856737 4954 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.126120366s: [/var/lib/containers/storage/overlay/5e6d16170c2f496cf35bf6b84fde70f3abaa87cba198d332c8f9fd250a9203b0/diff /var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/rsync/0.log]; will 
not log again for this container unless duration exceeds 2s Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.856763 4954 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.15665532s: [/var/lib/containers/storage/overlay/61991646da41b8cadc9468f9529044240519b61b91f4ca46486caea6bfb57ad5/diff /var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/rsync/0.log]; will not log again for this container unless duration exceeds 2s Dec 06 10:27:13 crc kubenswrapper[4954]: I1206 10:27:13.875382 4954 trace.go:236] Trace[97527294]: "Calculate volume metrics of kube-api-access-nd8vd for pod openstack/dnsmasq-dns-7569c45955-cmgjc" (06-Dec-2025 10:27:12.292) (total time: 1538ms): Dec 06 10:27:13 crc kubenswrapper[4954]: Trace[97527294]: [1.538453099s] [1.538453099s] END Dec 06 10:27:14 crc kubenswrapper[4954]: I1206 10:27:14.898836 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerStarted","Data":"00b9a347f49230fc6f311a5d4693c4da9507f9c6295723e4b734b287f1461698"} Dec 06 10:27:14 crc kubenswrapper[4954]: I1206 10:27:14.924067 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ph8f9" podStartSLOduration=3.6614002230000002 podStartE2EDuration="13.923739926s" podCreationTimestamp="2025-12-06 10:27:01 +0000 UTC" firstStartedPulling="2025-12-06 10:27:04.051941399 +0000 UTC m=+12598.865300778" lastFinishedPulling="2025-12-06 10:27:14.314281092 +0000 UTC m=+12609.127640481" observedRunningTime="2025-12-06 10:27:14.915762563 +0000 UTC m=+12609.729121952" watchObservedRunningTime="2025-12-06 10:27:14.923739926 +0000 UTC m=+12609.737099315" Dec 06 10:27:21 crc kubenswrapper[4954]: I1206 10:27:21.961636 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:21 crc kubenswrapper[4954]: I1206 10:27:21.963002 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:23 crc kubenswrapper[4954]: I1206 10:27:23.027183 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ph8f9" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" probeResult="failure" output=< Dec 06 10:27:23 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:27:23 crc kubenswrapper[4954]: > Dec 06 10:27:26 crc kubenswrapper[4954]: I1206 10:27:26.444012 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:27:26 crc kubenswrapper[4954]: E1206 10:27:26.444872 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:27:33 crc kubenswrapper[4954]: I1206 10:27:33.070731 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ph8f9" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" probeResult="failure" output=< Dec 06 10:27:33 crc 
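[Annotation] The pod_startup_latency_tracker entry above can be verified from its own fields: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (firstStartedPulling to lastFinishedPulling). A small Go check using the timestamps copied from that line:

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matches Go's default time.Time string form used in the log.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-06 10:27:01 +0000 UTC")
	firstPull := parse("2025-12-06 10:27:04.051941399 +0000 UTC")
	lastPull := parse("2025-12-06 10:27:14.314281092 +0000 UTC")
	running := parse("2025-12-06 10:27:14.923739926 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	// Prints 13.923739926s and 3.661400233s, matching podStartE2EDuration
	// and podStartSLOduration in the log entry.
	fmt.Println(e2e, slo)
}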
Dec 06 10:27:33 crc kubenswrapper[4954]: I1206 10:27:33.070731 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ph8f9" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:27:33 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:27:33 crc kubenswrapper[4954]: >
Dec 06 10:27:39 crc kubenswrapper[4954]: I1206 10:27:39.454081 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:27:39 crc kubenswrapper[4954]: E1206 10:27:39.463391 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:27:40 crc kubenswrapper[4954]: I1206 10:27:40.981220 4954 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6wbv2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 06 10:27:40 crc kubenswrapper[4954]: I1206 10:27:40.987942 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbv2" podUID="76a9970f-0017-4289-b4d5-804bbf7b0e9d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 06 10:27:40 crc kubenswrapper[4954]: I1206 10:27:40.988633 4954 trace.go:236] Trace[1317931493]: "Calculate volume metrics of v4-0-config-system-trusted-ca-bundle for pod openshift-authentication/oauth-openshift-679cb4ddc5-x8vqv" (06-Dec-2025 10:27:39.952) (total time: 1031ms):
Dec 06 10:27:40 crc kubenswrapper[4954]: Trace[1317931493]: [1.031852564s] [1.031852564s] END
Dec 06 10:27:43 crc kubenswrapper[4954]: I1206 10:27:43.025722 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ph8f9" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:27:43 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:27:43 crc kubenswrapper[4954]: >
Dec 06 10:27:50 crc kubenswrapper[4954]: I1206 10:27:50.443639 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d"
Dec 06 10:27:50 crc kubenswrapper[4954]: E1206 10:27:50.444740 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:27:52 crc kubenswrapper[4954]: I1206 10:27:52.033307 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ph8f9"
Dec 06 10:27:52 crc kubenswrapper[4954]: I1206 10:27:52.115828 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ph8f9"
Dec 06 10:27:52 crc kubenswrapper[4954]: I1206 10:27:52.393731 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"]
Dec 06 10:27:53 crc kubenswrapper[4954]: I1206 10:27:53.490674 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ph8f9" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" containerID="cri-o://00b9a347f49230fc6f311a5d4693c4da9507f9c6295723e4b734b287f1461698" gracePeriod=2
Dec 06 10:27:54 crc kubenswrapper[4954]: I1206 10:27:54.504927 4954 generic.go:334] "Generic (PLEG): container finished" podID="8784656c-ced3-45fc-98db-857dd2b641ec" containerID="00b9a347f49230fc6f311a5d4693c4da9507f9c6295723e4b734b287f1461698" exitCode=0
Dec 06 10:27:54 crc kubenswrapper[4954]: I1206 10:27:54.505022 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerDied","Data":"00b9a347f49230fc6f311a5d4693c4da9507f9c6295723e4b734b287f1461698"}
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.813701 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ph8f9"
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.899427 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42zr7\" (UniqueName: \"kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7\") pod \"8784656c-ced3-45fc-98db-857dd2b641ec\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") "
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.899974 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content\") pod \"8784656c-ced3-45fc-98db-857dd2b641ec\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") "
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.900084 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities\") pod \"8784656c-ced3-45fc-98db-857dd2b641ec\" (UID: \"8784656c-ced3-45fc-98db-857dd2b641ec\") "
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.904172 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities" (OuterVolumeSpecName: "utilities") pod "8784656c-ced3-45fc-98db-857dd2b641ec" (UID: "8784656c-ced3-45fc-98db-857dd2b641ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:27:55 crc kubenswrapper[4954]: I1206 10:27:55.923500 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7" (OuterVolumeSpecName: "kube-api-access-42zr7") pod "8784656c-ced3-45fc-98db-857dd2b641ec" (UID: "8784656c-ced3-45fc-98db-857dd2b641ec"). InnerVolumeSpecName "kube-api-access-42zr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.003204 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42zr7\" (UniqueName: \"kubernetes.io/projected/8784656c-ced3-45fc-98db-857dd2b641ec-kube-api-access-42zr7\") on node \"crc\" DevicePath \"\""
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.003412 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-utilities\") on node \"crc\" DevicePath \"\""
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.019115 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8784656c-ced3-45fc-98db-857dd2b641ec" (UID: "8784656c-ced3-45fc-98db-857dd2b641ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.130212 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8784656c-ced3-45fc-98db-857dd2b641ec-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.526181 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph8f9" event={"ID":"8784656c-ced3-45fc-98db-857dd2b641ec","Type":"ContainerDied","Data":"0950cd5bb90a139a7f8d396c1d91517e8e93b8701dfcc3ab5454dc9d99f00cc6"}
Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.526228 4954 scope.go:117] "RemoveContainer" containerID="00b9a347f49230fc6f311a5d4693c4da9507f9c6295723e4b734b287f1461698"
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph8f9" Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.572497 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"] Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.596462 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ph8f9"] Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.597920 4954 scope.go:117] "RemoveContainer" containerID="710e53f223b583f9e6c9cde9802fcc8a0ae2926d97769f739e09b211f4a06d25" Dec 06 10:27:56 crc kubenswrapper[4954]: I1206 10:27:56.643790 4954 scope.go:117] "RemoveContainer" containerID="c243eac00c843a76b5a3533aca446a2da8fda816c087bb136ab28f1ef3d45187" Dec 06 10:27:57 crc kubenswrapper[4954]: I1206 10:27:57.466800 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" path="/var/lib/kubelet/pods/8784656c-ced3-45fc-98db-857dd2b641ec/volumes" Dec 06 10:28:01 crc kubenswrapper[4954]: I1206 10:28:01.444762 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:28:01 crc kubenswrapper[4954]: E1206 10:28:01.445436 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:28:16 crc kubenswrapper[4954]: I1206 10:28:16.444175 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:28:16 crc kubenswrapper[4954]: E1206 10:28:16.445083 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:28:29 crc kubenswrapper[4954]: I1206 10:28:29.443056 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:28:29 crc kubenswrapper[4954]: E1206 10:28:29.443824 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.893428 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:36 crc kubenswrapper[4954]: E1206 10:28:36.894879 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="extract-content" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.894937 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="extract-content" Dec 06 10:28:36 crc kubenswrapper[4954]: E1206 10:28:36.894962 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="extract-utilities" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.894970 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="extract-utilities" Dec 06 10:28:36 crc kubenswrapper[4954]: E1206 10:28:36.895000 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.895008 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.895235 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="8784656c-ced3-45fc-98db-857dd2b641ec" containerName="registry-server" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.898024 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:36 crc kubenswrapper[4954]: I1206 10:28:36.929442 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.007487 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.007593 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2rwb\" (UniqueName: \"kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.007690 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.108973 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.109055 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2rwb\" (UniqueName: \"kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.109111 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.109552 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.109632 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.135884 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2rwb\" (UniqueName: \"kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb\") pod \"redhat-marketplace-tvvs9\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:37 crc kubenswrapper[4954]: I1206 10:28:37.224977 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:38 crc kubenswrapper[4954]: I1206 10:28:38.317470 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:39 crc kubenswrapper[4954]: I1206 10:28:39.175160 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerID="66f95d592b06bcfacefe55bd03cb6b62847057e8408fa0c738b97fefeeb9cdc6" exitCode=0 Dec 06 10:28:39 crc kubenswrapper[4954]: I1206 10:28:39.175354 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerDied","Data":"66f95d592b06bcfacefe55bd03cb6b62847057e8408fa0c738b97fefeeb9cdc6"} Dec 06 10:28:39 crc kubenswrapper[4954]: I1206 10:28:39.175632 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerStarted","Data":"f31007964dc405e1ecd9871a9b2b0e8ebf960e7ba6da9eb5ba0f43d94537c456"} Dec 06 10:28:40 crc kubenswrapper[4954]: I1206 10:28:40.186961 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerStarted","Data":"5aabdde27bd0018c3ff9afdc794e4a07c252a9c6b937207aaadc155249628037"} Dec 06 10:28:41 crc kubenswrapper[4954]: I1206 10:28:41.091280 4954 scope.go:117] "RemoveContainer" containerID="7385833dcb3df8029663bd48276715af3216998d7c6a06f3a6bf0f7a66fd8156" Dec 06 10:28:41 crc kubenswrapper[4954]: I1206 10:28:41.133953 4954 scope.go:117] "RemoveContainer" containerID="4acc2997f734fc62da52e9d203f9b605d874cf33e77d411b25e32ffc1c7235f3" Dec 06 10:28:41 crc kubenswrapper[4954]: I1206 10:28:41.182607 4954 scope.go:117] "RemoveContainer" 
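[Annotation] The mount sequence for redhat-marketplace-tvvs9 above runs in two steps per volume: VerifyControllerAttachedVolume confirms the volume in the kubelet's actual state of world, then MountVolume.SetUp materializes it under /var/lib/kubelet/pods/<uid>/volumes. The three volumes are two emptyDirs (utilities, catalog-content) and the auto-injected projected service-account token (kube-api-access-c2rwb), matching the plugin names kubernetes.io/empty-dir and kubernetes.io/projected in the records. A sketch of the equivalent volume definitions with k8s.io/api types (the kube-api-access volume is normally injected by admission, and its exact projection list here is an assumption):

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	// Volume sources corresponding to the plugin names in the log.
    	vols := []corev1.Volume{
    		{Name: "utilities", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		// kube-api-access-* is injected automatically as a projected
    		// volume (token + CA bundle + namespace); shown here only to
    		// illustrate the kubernetes.io/projected plugin name.
    		{Name: "kube-api-access-c2rwb", VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{
    				Sources: []corev1.VolumeProjection{
    					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
    				},
    			},
    		}},
    	}
    	for _, v := range vols {
    		fmt.Println(v.Name)
    	}
    }

The PLEG events above appear to be the catalog pod's normal startup: two short-lived containers (consistent with the extract-utilities and extract-content names seen in the stale-state records) each finish with exitCode=0 before the long-running container comes up.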
containerID="cc987b9ede5ed31785526a76ef8347df2109d1d168dcab6a1acc5c23dda6d3f4" Dec 06 10:28:41 crc kubenswrapper[4954]: I1206 10:28:41.219668 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerID="5aabdde27bd0018c3ff9afdc794e4a07c252a9c6b937207aaadc155249628037" exitCode=0 Dec 06 10:28:41 crc kubenswrapper[4954]: I1206 10:28:41.219802 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerDied","Data":"5aabdde27bd0018c3ff9afdc794e4a07c252a9c6b937207aaadc155249628037"} Dec 06 10:28:42 crc kubenswrapper[4954]: I1206 10:28:42.235070 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerStarted","Data":"aad764e5c2d46eae0339effbac86a5e7fd00af000b408705686f90c6a87fb1e2"} Dec 06 10:28:42 crc kubenswrapper[4954]: I1206 10:28:42.255266 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tvvs9" podStartSLOduration=3.767918157 podStartE2EDuration="6.25408964s" podCreationTimestamp="2025-12-06 10:28:36 +0000 UTC" firstStartedPulling="2025-12-06 10:28:39.177411898 +0000 UTC m=+12693.990771287" lastFinishedPulling="2025-12-06 10:28:41.663583381 +0000 UTC m=+12696.476942770" observedRunningTime="2025-12-06 10:28:42.251974704 +0000 UTC m=+12697.065334093" watchObservedRunningTime="2025-12-06 10:28:42.25408964 +0000 UTC m=+12697.067449029" Dec 06 10:28:44 crc kubenswrapper[4954]: I1206 10:28:44.443804 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:28:44 crc kubenswrapper[4954]: E1206 10:28:44.444383 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:28:47 crc kubenswrapper[4954]: I1206 10:28:47.225847 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:47 crc kubenswrapper[4954]: I1206 10:28:47.226348 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:47 crc kubenswrapper[4954]: I1206 10:28:47.294091 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:47 crc kubenswrapper[4954]: I1206 10:28:47.358525 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:47 crc kubenswrapper[4954]: I1206 10:28:47.541697 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:49 crc kubenswrapper[4954]: I1206 10:28:49.303752 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tvvs9" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="registry-server" 
containerID="cri-o://aad764e5c2d46eae0339effbac86a5e7fd00af000b408705686f90c6a87fb1e2" gracePeriod=2 Dec 06 10:28:50 crc kubenswrapper[4954]: I1206 10:28:50.321812 4954 generic.go:334] "Generic (PLEG): container finished" podID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerID="aad764e5c2d46eae0339effbac86a5e7fd00af000b408705686f90c6a87fb1e2" exitCode=0 Dec 06 10:28:50 crc kubenswrapper[4954]: I1206 10:28:50.321874 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerDied","Data":"aad764e5c2d46eae0339effbac86a5e7fd00af000b408705686f90c6a87fb1e2"} Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.768827 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.851647 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities\") pod \"7ca28c5c-b483-4904-adb3-e419cc80bec4\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.851780 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content\") pod \"7ca28c5c-b483-4904-adb3-e419cc80bec4\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.851976 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2rwb\" (UniqueName: \"kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb\") pod \"7ca28c5c-b483-4904-adb3-e419cc80bec4\" (UID: \"7ca28c5c-b483-4904-adb3-e419cc80bec4\") " Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.852522 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities" (OuterVolumeSpecName: "utilities") pod "7ca28c5c-b483-4904-adb3-e419cc80bec4" (UID: "7ca28c5c-b483-4904-adb3-e419cc80bec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.858349 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb" (OuterVolumeSpecName: "kube-api-access-c2rwb") pod "7ca28c5c-b483-4904-adb3-e419cc80bec4" (UID: "7ca28c5c-b483-4904-adb3-e419cc80bec4"). InnerVolumeSpecName "kube-api-access-c2rwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.870687 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ca28c5c-b483-4904-adb3-e419cc80bec4" (UID: "7ca28c5c-b483-4904-adb3-e419cc80bec4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.954626 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2rwb\" (UniqueName: \"kubernetes.io/projected/7ca28c5c-b483-4904-adb3-e419cc80bec4-kube-api-access-c2rwb\") on node \"crc\" DevicePath \"\"" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.954661 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:28:51 crc kubenswrapper[4954]: I1206 10:28:51.954671 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca28c5c-b483-4904-adb3-e419cc80bec4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.351688 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvvs9" event={"ID":"7ca28c5c-b483-4904-adb3-e419cc80bec4","Type":"ContainerDied","Data":"f31007964dc405e1ecd9871a9b2b0e8ebf960e7ba6da9eb5ba0f43d94537c456"} Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.351762 4954 scope.go:117] "RemoveContainer" containerID="aad764e5c2d46eae0339effbac86a5e7fd00af000b408705686f90c6a87fb1e2" Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.351784 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvvs9" Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.393514 4954 scope.go:117] "RemoveContainer" containerID="5aabdde27bd0018c3ff9afdc794e4a07c252a9c6b937207aaadc155249628037" Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.409764 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.420971 4954 scope.go:117] "RemoveContainer" containerID="66f95d592b06bcfacefe55bd03cb6b62847057e8408fa0c738b97fefeeb9cdc6" Dec 06 10:28:52 crc kubenswrapper[4954]: I1206 10:28:52.424782 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvvs9"] Dec 06 10:28:53 crc kubenswrapper[4954]: I1206 10:28:53.455065 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" path="/var/lib/kubelet/pods/7ca28c5c-b483-4904-adb3-e419cc80bec4/volumes" Dec 06 10:28:59 crc kubenswrapper[4954]: I1206 10:28:59.443603 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:28:59 crc kubenswrapper[4954]: E1206 10:28:59.444394 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:29:14 crc kubenswrapper[4954]: I1206 10:29:14.443276 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:29:14 crc kubenswrapper[4954]: E1206 10:29:14.444109 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:29:28 crc kubenswrapper[4954]: I1206 10:29:28.444414 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:29:28 crc kubenswrapper[4954]: E1206 10:29:28.445203 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:29:39 crc kubenswrapper[4954]: I1206 10:29:39.444116 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:29:39 crc kubenswrapper[4954]: E1206 10:29:39.445076 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:29:52 crc kubenswrapper[4954]: I1206 10:29:52.443966 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:29:52 crc kubenswrapper[4954]: E1206 10:29:52.445223 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.153844 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd"] Dec 06 10:30:00 crc kubenswrapper[4954]: E1206 10:30:00.154921 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="extract-content" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.154937 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="extract-content" Dec 06 10:30:00 crc kubenswrapper[4954]: E1206 10:30:00.154962 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="extract-utilities" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.154971 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="extract-utilities" Dec 06 10:30:00 crc kubenswrapper[4954]: E1206 10:30:00.154995 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="registry-server" Dec 06 10:30:00 crc 
kubenswrapper[4954]: I1206 10:30:00.155002 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="registry-server" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.155209 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca28c5c-b483-4904-adb3-e419cc80bec4" containerName="registry-server" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.155994 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.170168 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.174867 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd"] Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.239233 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.337369 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.337446 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq6qw\" (UniqueName: \"kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.337525 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.439501 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.439602 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq6qw\" (UniqueName: \"kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.439651 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.440528 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.445920 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.459135 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq6qw\" (UniqueName: \"kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw\") pod \"collect-profiles-29416950-h2xvd\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:00 crc kubenswrapper[4954]: I1206 10:30:00.551272 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:01 crc kubenswrapper[4954]: I1206 10:30:01.463732 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd"] Dec 06 10:30:02 crc kubenswrapper[4954]: I1206 10:30:02.276315 4954 generic.go:334] "Generic (PLEG): container finished" podID="0923ee42-b759-4708-8a3a-7c875f37d3f4" containerID="1bb4665a70a4aeafef8c43517737c814e092d14b21d95512df315bfb8c3ea820" exitCode=0 Dec 06 10:30:02 crc kubenswrapper[4954]: I1206 10:30:02.276376 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" event={"ID":"0923ee42-b759-4708-8a3a-7c875f37d3f4","Type":"ContainerDied","Data":"1bb4665a70a4aeafef8c43517737c814e092d14b21d95512df315bfb8c3ea820"} Dec 06 10:30:02 crc kubenswrapper[4954]: I1206 10:30:02.276942 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" event={"ID":"0923ee42-b759-4708-8a3a-7c875f37d3f4","Type":"ContainerStarted","Data":"0df5dba5b3ea35613d578c95c7677f4cb946e03ae93304d0ae551a353f45f1f2"} Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.803163 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.933626 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume\") pod \"0923ee42-b759-4708-8a3a-7c875f37d3f4\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.933723 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume\") pod \"0923ee42-b759-4708-8a3a-7c875f37d3f4\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.933758 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq6qw\" (UniqueName: \"kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw\") pod \"0923ee42-b759-4708-8a3a-7c875f37d3f4\" (UID: \"0923ee42-b759-4708-8a3a-7c875f37d3f4\") " Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.934461 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume" (OuterVolumeSpecName: "config-volume") pod "0923ee42-b759-4708-8a3a-7c875f37d3f4" (UID: "0923ee42-b759-4708-8a3a-7c875f37d3f4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.935102 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0923ee42-b759-4708-8a3a-7c875f37d3f4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.940156 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0923ee42-b759-4708-8a3a-7c875f37d3f4" (UID: "0923ee42-b759-4708-8a3a-7c875f37d3f4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:30:04 crc kubenswrapper[4954]: I1206 10:30:04.954787 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw" (OuterVolumeSpecName: "kube-api-access-xq6qw") pod "0923ee42-b759-4708-8a3a-7c875f37d3f4" (UID: "0923ee42-b759-4708-8a3a-7c875f37d3f4"). InnerVolumeSpecName "kube-api-access-xq6qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.036551 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0923ee42-b759-4708-8a3a-7c875f37d3f4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.036612 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq6qw\" (UniqueName: \"kubernetes.io/projected/0923ee42-b759-4708-8a3a-7c875f37d3f4-kube-api-access-xq6qw\") on node \"crc\" DevicePath \"\"" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.307824 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" event={"ID":"0923ee42-b759-4708-8a3a-7c875f37d3f4","Type":"ContainerDied","Data":"0df5dba5b3ea35613d578c95c7677f4cb946e03ae93304d0ae551a353f45f1f2"} Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.308196 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df5dba5b3ea35613d578c95c7677f4cb946e03ae93304d0ae551a353f45f1f2" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.307867 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416950-h2xvd" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.461865 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:30:05 crc kubenswrapper[4954]: E1206 10:30:05.462225 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.885285 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"] Dec 06 10:30:05 crc kubenswrapper[4954]: I1206 10:30:05.901042 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416905-zmj6f"] Dec 06 10:30:07 crc kubenswrapper[4954]: I1206 10:30:07.456655 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29c6b96-904b-4656-a13d-c338c5173352" path="/var/lib/kubelet/pods/f29c6b96-904b-4656-a13d-c338c5173352/volumes" Dec 06 10:30:17 crc kubenswrapper[4954]: I1206 10:30:17.443841 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:30:17 crc kubenswrapper[4954]: E1206 10:30:17.444696 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:30:32 crc kubenswrapper[4954]: I1206 10:30:32.443604 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 
10:30:32 crc kubenswrapper[4954]: E1206 10:30:32.444352 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:30:41 crc kubenswrapper[4954]: I1206 10:30:41.331877 4954 scope.go:117] "RemoveContainer" containerID="ae6775abc5896a431d58f0a2fe22ceaf79e7edd6b6c5ae3e64c048fa4b9b9166" Dec 06 10:30:46 crc kubenswrapper[4954]: I1206 10:30:46.443525 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:30:46 crc kubenswrapper[4954]: E1206 10:30:46.444357 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:30:58 crc kubenswrapper[4954]: I1206 10:30:58.443477 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:30:58 crc kubenswrapper[4954]: E1206 10:30:58.444269 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:31:09 crc kubenswrapper[4954]: I1206 10:31:09.446290 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:31:09 crc kubenswrapper[4954]: E1206 10:31:09.446977 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:31:21 crc kubenswrapper[4954]: I1206 10:31:21.443800 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:31:22 crc kubenswrapper[4954]: I1206 10:31:22.105205 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c"} Dec 06 10:31:33 crc kubenswrapper[4954]: I1206 10:31:33.237772 4954 generic.go:334] "Generic (PLEG): container finished" podID="86f6275c-9439-4a32-a0b7-467f7df9670f" containerID="3292d8886bc231cb5507d8f5dfba26c04d8627aec908a3d6ebf2fc547a27f018" exitCode=0 Dec 06 10:31:33 crc kubenswrapper[4954]: I1206 10:31:33.237853 4954 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86f6275c-9439-4a32-a0b7-467f7df9670f","Type":"ContainerDied","Data":"3292d8886bc231cb5507d8f5dfba26c04d8627aec908a3d6ebf2fc547a27f018"} Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.865589 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973702 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973776 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973833 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973849 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973931 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.973967 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.974753 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data" (OuterVolumeSpecName: "config-data") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.974781 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.974889 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.974909 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z6t6\" (UniqueName: \"kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6\") pod \"86f6275c-9439-4a32-a0b7-467f7df9670f\" (UID: \"86f6275c-9439-4a32-a0b7-467f7df9670f\") " Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.978572 4954 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.979889 4954 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.982937 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6" (OuterVolumeSpecName: "kube-api-access-7z6t6") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "kube-api-access-7z6t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.983142 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:31:35 crc kubenswrapper[4954]: I1206 10:31:35.986723 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.016267 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.019901 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.020017 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.040092 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "86f6275c-9439-4a32-a0b7-467f7df9670f" (UID: "86f6275c-9439-4a32-a0b7-467f7df9670f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082282 4954 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082349 4954 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082364 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082374 4954 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/86f6275c-9439-4a32-a0b7-467f7df9670f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082384 4954 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86f6275c-9439-4a32-a0b7-467f7df9670f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082404 4954 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/86f6275c-9439-4a32-a0b7-467f7df9670f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.082414 4954 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7z6t6\" (UniqueName: \"kubernetes.io/projected/86f6275c-9439-4a32-a0b7-467f7df9670f-kube-api-access-7z6t6\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.108061 4954 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.184209 4954 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.276409 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"86f6275c-9439-4a32-a0b7-467f7df9670f","Type":"ContainerDied","Data":"9f948789366bfc24f3b17859f6ae0920a0692a8cebc5b71d42e4a487777fdc08"} Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.276447 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f948789366bfc24f3b17859f6ae0920a0692a8cebc5b71d42e4a487777fdc08" Dec 06 10:31:36 crc kubenswrapper[4954]: I1206 10:31:36.276508 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.091285 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:31:39 crc kubenswrapper[4954]: E1206 10:31:39.092286 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f6275c-9439-4a32-a0b7-467f7df9670f" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.092301 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f6275c-9439-4a32-a0b7-467f7df9670f" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:31:39 crc kubenswrapper[4954]: E1206 10:31:39.092339 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0923ee42-b759-4708-8a3a-7c875f37d3f4" containerName="collect-profiles" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.092345 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="0923ee42-b759-4708-8a3a-7c875f37d3f4" containerName="collect-profiles" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.092538 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="0923ee42-b759-4708-8a3a-7c875f37d3f4" containerName="collect-profiles" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.092551 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f6275c-9439-4a32-a0b7-467f7df9670f" containerName="tempest-tests-tempest-tests-runner" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.093520 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.096105 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-s69fq" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.102521 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.243807 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zc8\" (UniqueName: \"kubernetes.io/projected/540573fa-cbaa-42cc-9ed7-a927a28dc7c9-kube-api-access-x5zc8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.243912 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.346058 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zc8\" (UniqueName: \"kubernetes.io/projected/540573fa-cbaa-42cc-9ed7-a927a28dc7c9-kube-api-access-x5zc8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.346246 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.346954 4954 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.372427 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zc8\" (UniqueName: \"kubernetes.io/projected/540573fa-cbaa-42cc-9ed7-a927a28dc7c9-kube-api-access-x5zc8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc kubenswrapper[4954]: I1206 10:31:39.377147 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"540573fa-cbaa-42cc-9ed7-a927a28dc7c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:39 crc 
kubenswrapper[4954]: I1206 10:31:39.427770 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 10:31:40 crc kubenswrapper[4954]: I1206 10:31:40.134779 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 10:31:40 crc kubenswrapper[4954]: I1206 10:31:40.148188 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:31:40 crc kubenswrapper[4954]: I1206 10:31:40.376530 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"540573fa-cbaa-42cc-9ed7-a927a28dc7c9","Type":"ContainerStarted","Data":"398cf1adb27748864a7ef994e8652f9242356481bf3bf70727a1df2a1e56449b"} Dec 06 10:31:42 crc kubenswrapper[4954]: I1206 10:31:42.408767 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"540573fa-cbaa-42cc-9ed7-a927a28dc7c9","Type":"ContainerStarted","Data":"4ea44dd8a33a726f2ad42fef57fc202819be6d7dda520e76e5fabb577b5de9d1"} Dec 06 10:31:55 crc kubenswrapper[4954]: I1206 10:31:55.983761 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=15.891832167 podStartE2EDuration="16.983738758s" podCreationTimestamp="2025-12-06 10:31:39 +0000 UTC" firstStartedPulling="2025-12-06 10:31:40.147951765 +0000 UTC m=+12874.961311154" lastFinishedPulling="2025-12-06 10:31:41.239858356 +0000 UTC m=+12876.053217745" observedRunningTime="2025-12-06 10:31:42.427393384 +0000 UTC m=+12877.240752773" watchObservedRunningTime="2025-12-06 10:31:55.983738758 +0000 UTC m=+12890.797098147" Dec 06 10:31:55 crc kubenswrapper[4954]: I1206 10:31:55.991401 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.017440 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.049884 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.078909 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.079010 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbqp\" (UniqueName: \"kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.079056 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.181964 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.182068 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbqp\" (UniqueName: \"kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.182114 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.182717 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.182799 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.216453 4954 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gtbqp\" (UniqueName: \"kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp\") pod \"certified-operators-f887p\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:56 crc kubenswrapper[4954]: I1206 10:31:56.356632 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:31:57 crc kubenswrapper[4954]: I1206 10:31:57.102180 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:31:57 crc kubenswrapper[4954]: W1206 10:31:57.115884 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9669e74c_1db4_47f6_aeec_291d98468906.slice/crio-85f167abb08b8028af6a9306a3ae112a0b0b2b86cc0fa314262b70364385c25c WatchSource:0}: Error finding container 85f167abb08b8028af6a9306a3ae112a0b0b2b86cc0fa314262b70364385c25c: Status 404 returned error can't find the container with id 85f167abb08b8028af6a9306a3ae112a0b0b2b86cc0fa314262b70364385c25c Dec 06 10:31:57 crc kubenswrapper[4954]: I1206 10:31:57.638499 4954 generic.go:334] "Generic (PLEG): container finished" podID="9669e74c-1db4-47f6-aeec-291d98468906" containerID="d520ef986a986a8367f59ebe995ed649d563a03b6398f639b7b6090642cbf063" exitCode=0 Dec 06 10:31:57 crc kubenswrapper[4954]: I1206 10:31:57.639023 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerDied","Data":"d520ef986a986a8367f59ebe995ed649d563a03b6398f639b7b6090642cbf063"} Dec 06 10:31:57 crc kubenswrapper[4954]: I1206 10:31:57.639050 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerStarted","Data":"85f167abb08b8028af6a9306a3ae112a0b0b2b86cc0fa314262b70364385c25c"} Dec 06 10:31:59 crc kubenswrapper[4954]: I1206 10:31:59.666447 4954 generic.go:334] "Generic (PLEG): container finished" podID="9669e74c-1db4-47f6-aeec-291d98468906" containerID="ddcb76e7cb96c8e05421a07a3edcaef865066e02183239861bd61221d9ea1e16" exitCode=0 Dec 06 10:31:59 crc kubenswrapper[4954]: I1206 10:31:59.666596 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerDied","Data":"ddcb76e7cb96c8e05421a07a3edcaef865066e02183239861bd61221d9ea1e16"} Dec 06 10:32:00 crc kubenswrapper[4954]: I1206 10:32:00.681125 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerStarted","Data":"b510fa056add55b4640d6fd263d48edb496990c07e59e1eb51ce1957f919b146"} Dec 06 10:32:00 crc kubenswrapper[4954]: I1206 10:32:00.709176 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f887p" podStartSLOduration=3.296829023 podStartE2EDuration="5.709159541s" podCreationTimestamp="2025-12-06 10:31:55 +0000 UTC" firstStartedPulling="2025-12-06 10:31:57.641648539 +0000 UTC m=+12892.455007928" lastFinishedPulling="2025-12-06 10:32:00.053979057 +0000 UTC m=+12894.867338446" observedRunningTime="2025-12-06 10:32:00.70273082 +0000 UTC 
m=+12895.516090209" watchObservedRunningTime="2025-12-06 10:32:00.709159541 +0000 UTC m=+12895.522518920" Dec 06 10:32:06 crc kubenswrapper[4954]: I1206 10:32:06.357750 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:06 crc kubenswrapper[4954]: I1206 10:32:06.358374 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:06 crc kubenswrapper[4954]: I1206 10:32:06.408194 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:06 crc kubenswrapper[4954]: I1206 10:32:06.813546 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:06 crc kubenswrapper[4954]: I1206 10:32:06.876731 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:32:08 crc kubenswrapper[4954]: I1206 10:32:08.760346 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f887p" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="registry-server" containerID="cri-o://b510fa056add55b4640d6fd263d48edb496990c07e59e1eb51ce1957f919b146" gracePeriod=2 Dec 06 10:32:09 crc kubenswrapper[4954]: I1206 10:32:09.773089 4954 generic.go:334] "Generic (PLEG): container finished" podID="9669e74c-1db4-47f6-aeec-291d98468906" containerID="b510fa056add55b4640d6fd263d48edb496990c07e59e1eb51ce1957f919b146" exitCode=0 Dec 06 10:32:09 crc kubenswrapper[4954]: I1206 10:32:09.773252 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerDied","Data":"b510fa056add55b4640d6fd263d48edb496990c07e59e1eb51ce1957f919b146"} Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.316674 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.472103 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities\") pod \"9669e74c-1db4-47f6-aeec-291d98468906\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.472211 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtbqp\" (UniqueName: \"kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp\") pod \"9669e74c-1db4-47f6-aeec-291d98468906\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.472235 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content\") pod \"9669e74c-1db4-47f6-aeec-291d98468906\" (UID: \"9669e74c-1db4-47f6-aeec-291d98468906\") " Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.475414 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities" (OuterVolumeSpecName: "utilities") pod "9669e74c-1db4-47f6-aeec-291d98468906" (UID: "9669e74c-1db4-47f6-aeec-291d98468906"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.481862 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp" (OuterVolumeSpecName: "kube-api-access-gtbqp") pod "9669e74c-1db4-47f6-aeec-291d98468906" (UID: "9669e74c-1db4-47f6-aeec-291d98468906"). InnerVolumeSpecName "kube-api-access-gtbqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.525920 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9669e74c-1db4-47f6-aeec-291d98468906" (UID: "9669e74c-1db4-47f6-aeec-291d98468906"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.574088 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.574124 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtbqp\" (UniqueName: \"kubernetes.io/projected/9669e74c-1db4-47f6-aeec-291d98468906-kube-api-access-gtbqp\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.574135 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9669e74c-1db4-47f6-aeec-291d98468906-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.787747 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f887p" event={"ID":"9669e74c-1db4-47f6-aeec-291d98468906","Type":"ContainerDied","Data":"85f167abb08b8028af6a9306a3ae112a0b0b2b86cc0fa314262b70364385c25c"} Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.787800 4954 scope.go:117] "RemoveContainer" containerID="b510fa056add55b4640d6fd263d48edb496990c07e59e1eb51ce1957f919b146" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.787823 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f887p" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.817508 4954 scope.go:117] "RemoveContainer" containerID="ddcb76e7cb96c8e05421a07a3edcaef865066e02183239861bd61221d9ea1e16" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.842620 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.861368 4954 scope.go:117] "RemoveContainer" containerID="d520ef986a986a8367f59ebe995ed649d563a03b6398f639b7b6090642cbf063" Dec 06 10:32:10 crc kubenswrapper[4954]: I1206 10:32:10.883053 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f887p"] Dec 06 10:32:11 crc kubenswrapper[4954]: I1206 10:32:11.456795 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9669e74c-1db4-47f6-aeec-291d98468906" path="/var/lib/kubelet/pods/9669e74c-1db4-47f6-aeec-291d98468906/volumes" Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.989936 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kc95s/must-gather-xz68n"] Dec 06 10:32:54 crc kubenswrapper[4954]: E1206 10:32:54.990853 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="registry-server" Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.990867 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="registry-server" Dec 06 10:32:54 crc kubenswrapper[4954]: E1206 10:32:54.990901 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="extract-utilities" Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.990907 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="extract-utilities" Dec 06 10:32:54 crc kubenswrapper[4954]: E1206 10:32:54.990920 4954 
Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.990926 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="extract-content"
Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.991175 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669e74c-1db4-47f6-aeec-291d98468906" containerName="registry-server"
Dec 06 10:32:54 crc kubenswrapper[4954]: I1206 10:32:54.992268 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.000769 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kc95s"/"openshift-service-ca.crt"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.000858 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kc95s"/"kube-root-ca.crt"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.009077 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kc95s/must-gather-xz68n"]
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.113773 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.114192 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjls\" (UniqueName: \"kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.217701 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.217884 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjls\" (UniqueName: \"kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.218082 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.242221 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjls\" (UniqueName: \"kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls\") pod \"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n"
\"must-gather-xz68n\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " pod="openshift-must-gather-kc95s/must-gather-xz68n" Dec 06 10:32:55 crc kubenswrapper[4954]: I1206 10:32:55.318791 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/must-gather-xz68n" Dec 06 10:32:56 crc kubenswrapper[4954]: I1206 10:32:56.355832 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kc95s/must-gather-xz68n"] Dec 06 10:32:57 crc kubenswrapper[4954]: I1206 10:32:57.348920 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/must-gather-xz68n" event={"ID":"c6ef6020-8a06-4945-9286-ed7dec7f863b","Type":"ContainerStarted","Data":"4c74f679d93ca6dc45e8bb7689e5f50138af5fb59c8359f83467223158536ac2"} Dec 06 10:33:07 crc kubenswrapper[4954]: I1206 10:33:07.479130 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/must-gather-xz68n" event={"ID":"c6ef6020-8a06-4945-9286-ed7dec7f863b","Type":"ContainerStarted","Data":"7359f1c0f5e65d1baf2629d769298e5ac9ed07e3dfce69c7b414fbf11d976d88"} Dec 06 10:33:07 crc kubenswrapper[4954]: I1206 10:33:07.479880 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/must-gather-xz68n" event={"ID":"c6ef6020-8a06-4945-9286-ed7dec7f863b","Type":"ContainerStarted","Data":"ff4059269113fb66ba8e9a7d4645cc6e30a0491ae5d9f45a05d69c6ed25c26b1"} Dec 06 10:33:07 crc kubenswrapper[4954]: I1206 10:33:07.524363 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kc95s/must-gather-xz68n" podStartSLOduration=4.423402915 podStartE2EDuration="13.524340635s" podCreationTimestamp="2025-12-06 10:32:54 +0000 UTC" firstStartedPulling="2025-12-06 10:32:56.365797142 +0000 UTC m=+12951.179156531" lastFinishedPulling="2025-12-06 10:33:05.466734862 +0000 UTC m=+12960.280094251" observedRunningTime="2025-12-06 10:33:07.496776621 +0000 UTC m=+12962.310136010" watchObservedRunningTime="2025-12-06 10:33:07.524340635 +0000 UTC m=+12962.337700024" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.185261 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kc95s/crc-debug-mjb5z"] Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.187090 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.196832 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kc95s"/"default-dockercfg-rhqmt" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.369303 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hntk\" (UniqueName: \"kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.369589 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.471217 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.471492 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hntk\" (UniqueName: \"kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.471581 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.494483 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hntk\" (UniqueName: \"kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk\") pod \"crc-debug-mjb5z\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") " pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.517159 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:33:14 crc kubenswrapper[4954]: I1206 10:33:14.609357 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" event={"ID":"d273f60e-0c6f-40d4-aa34-e762c4ce09d8","Type":"ContainerStarted","Data":"d393f850ace0f7d5d2d83740f391af4d56484eceea28c9fc5240c48ec01854b7"} Dec 06 10:33:32 crc kubenswrapper[4954]: E1206 10:33:32.596428 4954 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 06 10:33:32 crc kubenswrapper[4954]: E1206 10:33:32.597976 4954 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hntk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-mjb5z_openshift-must-gather-kc95s(d273f60e-0c6f-40d4-aa34-e762c4ce09d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 10:33:32 crc kubenswrapper[4954]: E1206 10:33:32.599853 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" podUID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" Dec 06 10:33:32 crc 
Dec 06 10:33:40 crc kubenswrapper[4954]: I1206 10:33:40.101909 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:33:40 crc kubenswrapper[4954]: I1206 10:33:40.102757 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 10:33:47 crc kubenswrapper[4954]: I1206 10:33:47.025351 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" event={"ID":"d273f60e-0c6f-40d4-aa34-e762c4ce09d8","Type":"ContainerStarted","Data":"c7c0e4d35756979c3fcc73463dc91344fd30971ddf0d216a8a2d3a22b8c74d62"}
Dec 06 10:33:47 crc kubenswrapper[4954]: I1206 10:33:47.047733 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" podStartSLOduration=1.486270749 podStartE2EDuration="33.047713245s" podCreationTimestamp="2025-12-06 10:33:14 +0000 UTC" firstStartedPulling="2025-12-06 10:33:14.566768537 +0000 UTC m=+12969.380127926" lastFinishedPulling="2025-12-06 10:33:46.128211013 +0000 UTC m=+13000.941570422" observedRunningTime="2025-12-06 10:33:47.040199555 +0000 UTC m=+13001.853558964" watchObservedRunningTime="2025-12-06 10:33:47.047713245 +0000 UTC m=+13001.861072634"
Dec 06 10:34:10 crc kubenswrapper[4954]: I1206 10:34:10.101722 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 06 10:34:10 crc kubenswrapper[4954]: I1206 10:34:10.102465 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 06 10:34:33 crc kubenswrapper[4954]: I1206 10:34:33.580307 4954 generic.go:334] "Generic (PLEG): container finished" podID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" containerID="c7c0e4d35756979c3fcc73463dc91344fd30971ddf0d216a8a2d3a22b8c74d62" exitCode=0
Dec 06 10:34:33 crc kubenswrapper[4954]: I1206 10:34:33.580425 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" event={"ID":"d273f60e-0c6f-40d4-aa34-e762c4ce09d8","Type":"ContainerDied","Data":"c7c0e4d35756979c3fcc73463dc91344fd30971ddf0d216a8a2d3a22b8c74d62"}
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.716826 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-mjb5z"
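[Editor's note: each prober entry above is one failed HTTP liveness check: a GET against http://127.0.0.1:8798/health that dies with "connection refused". The failures recur every 30 seconds, consistent with a 30s periodSeconds, and the restart finally triggered at 10:34:40 (below) is consistent with a failureThreshold of 3, though neither value is printed here. The check itself reduces to a timed GET where any transport error or non-2xx/3xx status fails:]

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// One HTTP liveness check as the prober entries describe it.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status: %s", resp.Status)
	}
	return nil
}

func main() {
	fmt.Println(probe("http://127.0.0.1:8798/health"))
}
```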
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.767162 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-mjb5z"]
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.784479 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-mjb5z"]
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.827160 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host\") pod \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") "
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.827729 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hntk\" (UniqueName: \"kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk\") pod \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\" (UID: \"d273f60e-0c6f-40d4-aa34-e762c4ce09d8\") "
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.828075 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host" (OuterVolumeSpecName: "host") pod "d273f60e-0c6f-40d4-aa34-e762c4ce09d8" (UID: "d273f60e-0c6f-40d4-aa34-e762c4ce09d8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.828416 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-host\") on node \"crc\" DevicePath \"\""
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.856469 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk" (OuterVolumeSpecName: "kube-api-access-4hntk") pod "d273f60e-0c6f-40d4-aa34-e762c4ce09d8" (UID: "d273f60e-0c6f-40d4-aa34-e762c4ce09d8"). InnerVolumeSpecName "kube-api-access-4hntk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 10:34:34 crc kubenswrapper[4954]: I1206 10:34:34.929840 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hntk\" (UniqueName: \"kubernetes.io/projected/d273f60e-0c6f-40d4-aa34-e762c4ce09d8-kube-api-access-4hntk\") on node \"crc\" DevicePath \"\""
Dec 06 10:34:35 crc kubenswrapper[4954]: I1206 10:34:35.469659 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" path="/var/lib/kubelet/pods/d273f60e-0c6f-40d4-aa34-e762c4ce09d8/volumes"
Dec 06 10:34:35 crc kubenswrapper[4954]: I1206 10:34:35.604944 4954 scope.go:117] "RemoveContainer" containerID="c7c0e4d35756979c3fcc73463dc91344fd30971ddf0d216a8a2d3a22b8c74d62"
Dec 06 10:34:35 crc kubenswrapper[4954]: I1206 10:34:35.604966 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-mjb5z"
Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-mjb5z" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.051976 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kc95s/crc-debug-vbwp5"] Dec 06 10:34:36 crc kubenswrapper[4954]: E1206 10:34:36.052494 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" containerName="container-00" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.052507 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" containerName="container-00" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.052777 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="d273f60e-0c6f-40d4-aa34-e762c4ce09d8" containerName="container-00" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.053504 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.066234 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kc95s"/"default-dockercfg-rhqmt" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.160051 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hxv\" (UniqueName: \"kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.160104 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.262811 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hxv\" (UniqueName: \"kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.262879 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.263129 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.281554 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hxv\" (UniqueName: \"kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv\") pod \"crc-debug-vbwp5\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") " pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 
Dec 06 10:34:36 crc kubenswrapper[4954]: I1206 10:34:36.616647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-vbwp5" event={"ID":"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0","Type":"ContainerStarted","Data":"5be693013f62588ea8384f759a232d21941f92afd5d383bc93d62a3d06c05090"}
Dec 06 10:34:37 crc kubenswrapper[4954]: I1206 10:34:37.630381 4954 generic.go:334] "Generic (PLEG): container finished" podID="af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" containerID="d10aafe02940ced5843237f6d252bfca0a86644203c9447ed7c754a7aea0b381" exitCode=0
Dec 06 10:34:37 crc kubenswrapper[4954]: I1206 10:34:37.630486 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-vbwp5" event={"ID":"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0","Type":"ContainerDied","Data":"d10aafe02940ced5843237f6d252bfca0a86644203c9447ed7c754a7aea0b381"}
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.769989 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-vbwp5"
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.830966 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host\") pod \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") "
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.831071 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7hxv\" (UniqueName: \"kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv\") pod \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\" (UID: \"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0\") "
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.831596 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host" (OuterVolumeSpecName: "host") pod "af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" (UID: "af257e51-a2c9-4248-b0d1-97d3d4d0b2e0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.831734 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-host\") on node \"crc\" DevicePath \"\""
Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.838238 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv" (OuterVolumeSpecName: "kube-api-access-z7hxv") pod "af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" (UID: "af257e51-a2c9-4248-b0d1-97d3d4d0b2e0"). InnerVolumeSpecName "kube-api-access-z7hxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:34:38 crc kubenswrapper[4954]: I1206 10:34:38.933388 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hxv\" (UniqueName: \"kubernetes.io/projected/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0-kube-api-access-z7hxv\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:39 crc kubenswrapper[4954]: I1206 10:34:39.664275 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-vbwp5" event={"ID":"af257e51-a2c9-4248-b0d1-97d3d4d0b2e0","Type":"ContainerDied","Data":"5be693013f62588ea8384f759a232d21941f92afd5d383bc93d62a3d06c05090"} Dec 06 10:34:39 crc kubenswrapper[4954]: I1206 10:34:39.664538 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be693013f62588ea8384f759a232d21941f92afd5d383bc93d62a3d06c05090" Dec 06 10:34:39 crc kubenswrapper[4954]: I1206 10:34:39.664401 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-vbwp5" Dec 06 10:34:39 crc kubenswrapper[4954]: I1206 10:34:39.722003 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-vbwp5"] Dec 06 10:34:39 crc kubenswrapper[4954]: I1206 10:34:39.732665 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-vbwp5"] Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.101484 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.101588 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.101650 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.102951 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.103024 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c" gracePeriod=600 Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.715962 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c" exitCode=0 Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.716253 4954 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c"} Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.716279 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"} Dec 06 10:34:40 crc kubenswrapper[4954]: I1206 10:34:40.716296 4954 scope.go:117] "RemoveContainer" containerID="392d99c5d74225ba76b899262e66f77901996fd39be5a3c325afbc54b2e2056d" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.020979 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kc95s/crc-debug-td6xx"] Dec 06 10:34:41 crc kubenswrapper[4954]: E1206 10:34:41.022306 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" containerName="container-00" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.022394 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" containerName="container-00" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.022770 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" containerName="container-00" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.023658 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.026393 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kc95s"/"default-dockercfg-rhqmt" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.122124 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsh7m\" (UniqueName: \"kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.122758 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.225294 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsh7m\" (UniqueName: \"kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.225758 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.225925 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.246150 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsh7m\" (UniqueName: \"kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m\") pod \"crc-debug-td6xx\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.341417 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:41 crc kubenswrapper[4954]: W1206 10:34:41.370121 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9647e9_f9fd_485d_bbee_9bbd272e3e89.slice/crio-aedc3828bd2165bc42d6d8b6ad004c3450ccbfa68566916f5643eb2e5fb9f488 WatchSource:0}: Error finding container aedc3828bd2165bc42d6d8b6ad004c3450ccbfa68566916f5643eb2e5fb9f488: Status 404 returned error can't find the container with id aedc3828bd2165bc42d6d8b6ad004c3450ccbfa68566916f5643eb2e5fb9f488 Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.456475 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af257e51-a2c9-4248-b0d1-97d3d4d0b2e0" path="/var/lib/kubelet/pods/af257e51-a2c9-4248-b0d1-97d3d4d0b2e0/volumes" Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.728574 4954 generic.go:334] "Generic (PLEG): container finished" podID="ec9647e9-f9fd-485d-bbee-9bbd272e3e89" containerID="70d0b679f916a269f3cbfef0e89b2d0ecb4e5de9233880a12204997ce63c6504" exitCode=0 Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.728612 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-td6xx" event={"ID":"ec9647e9-f9fd-485d-bbee-9bbd272e3e89","Type":"ContainerDied","Data":"70d0b679f916a269f3cbfef0e89b2d0ecb4e5de9233880a12204997ce63c6504"} Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.728660 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/crc-debug-td6xx" event={"ID":"ec9647e9-f9fd-485d-bbee-9bbd272e3e89","Type":"ContainerStarted","Data":"aedc3828bd2165bc42d6d8b6ad004c3450ccbfa68566916f5643eb2e5fb9f488"} Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.772151 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-td6xx"] Dec 06 10:34:41 crc kubenswrapper[4954]: I1206 10:34:41.782774 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kc95s/crc-debug-td6xx"] Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.876906 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.961194 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsh7m\" (UniqueName: \"kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m\") pod \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.961268 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host\") pod \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\" (UID: \"ec9647e9-f9fd-485d-bbee-9bbd272e3e89\") " Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.961387 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host" (OuterVolumeSpecName: "host") pod "ec9647e9-f9fd-485d-bbee-9bbd272e3e89" (UID: "ec9647e9-f9fd-485d-bbee-9bbd272e3e89"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.962192 4954 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-host\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:42 crc kubenswrapper[4954]: I1206 10:34:42.967534 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m" (OuterVolumeSpecName: "kube-api-access-vsh7m") pod "ec9647e9-f9fd-485d-bbee-9bbd272e3e89" (UID: "ec9647e9-f9fd-485d-bbee-9bbd272e3e89"). InnerVolumeSpecName "kube-api-access-vsh7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:34:43 crc kubenswrapper[4954]: I1206 10:34:43.064019 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsh7m\" (UniqueName: \"kubernetes.io/projected/ec9647e9-f9fd-485d-bbee-9bbd272e3e89-kube-api-access-vsh7m\") on node \"crc\" DevicePath \"\"" Dec 06 10:34:43 crc kubenswrapper[4954]: I1206 10:34:43.456351 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9647e9-f9fd-485d-bbee-9bbd272e3e89" path="/var/lib/kubelet/pods/ec9647e9-f9fd-485d-bbee-9bbd272e3e89/volumes" Dec 06 10:34:43 crc kubenswrapper[4954]: I1206 10:34:43.756400 4954 scope.go:117] "RemoveContainer" containerID="70d0b679f916a269f3cbfef0e89b2d0ecb4e5de9233880a12204997ce63c6504" Dec 06 10:34:43 crc kubenswrapper[4954]: I1206 10:34:43.756529 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kc95s/crc-debug-td6xx" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.014412 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:10 crc kubenswrapper[4954]: E1206 10:35:10.015651 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9647e9-f9fd-485d-bbee-9bbd272e3e89" containerName="container-00" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.015667 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9647e9-f9fd-485d-bbee-9bbd272e3e89" containerName="container-00" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.015951 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9647e9-f9fd-485d-bbee-9bbd272e3e89" containerName="container-00" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.017899 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.028192 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.143781 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.143892 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.144003 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdzk\" (UniqueName: \"kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.246043 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.246413 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.246646 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content\") pod \"community-operators-zd4rh\" (UID: 
\"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.246730 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.246856 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssdzk\" (UniqueName: \"kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.270704 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssdzk\" (UniqueName: \"kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk\") pod \"community-operators-zd4rh\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:10 crc kubenswrapper[4954]: I1206 10:35:10.345588 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:11 crc kubenswrapper[4954]: I1206 10:35:11.165364 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:11 crc kubenswrapper[4954]: W1206 10:35:11.166943 4954 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54236f39_a086_4d56_8939_fdc29dfa7c10.slice/crio-1e8e8c60f9f75c9895be561f7fef619139d78cbf0df8f2f236cc2aa102b41bc9 WatchSource:0}: Error finding container 1e8e8c60f9f75c9895be561f7fef619139d78cbf0df8f2f236cc2aa102b41bc9: Status 404 returned error can't find the container with id 1e8e8c60f9f75c9895be561f7fef619139d78cbf0df8f2f236cc2aa102b41bc9 Dec 06 10:35:12 crc kubenswrapper[4954]: I1206 10:35:12.075820 4954 generic.go:334] "Generic (PLEG): container finished" podID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerID="f960c4c731082eb761a81a6ff4f2274c135773ffb971331a1dba263791869dd1" exitCode=0 Dec 06 10:35:12 crc kubenswrapper[4954]: I1206 10:35:12.076006 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerDied","Data":"f960c4c731082eb761a81a6ff4f2274c135773ffb971331a1dba263791869dd1"} Dec 06 10:35:12 crc kubenswrapper[4954]: I1206 10:35:12.076109 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerStarted","Data":"1e8e8c60f9f75c9895be561f7fef619139d78cbf0df8f2f236cc2aa102b41bc9"} Dec 06 10:35:13 crc kubenswrapper[4954]: I1206 10:35:13.088275 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerStarted","Data":"0f8ad844be8b8f796ade7368a755035239523d42b0501b65178845a08c61f409"} Dec 06 10:35:15 crc kubenswrapper[4954]: I1206 10:35:15.119430 4954 generic.go:334] "Generic (PLEG): container finished" 
podID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerID="0f8ad844be8b8f796ade7368a755035239523d42b0501b65178845a08c61f409" exitCode=0 Dec 06 10:35:15 crc kubenswrapper[4954]: I1206 10:35:15.119508 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerDied","Data":"0f8ad844be8b8f796ade7368a755035239523d42b0501b65178845a08c61f409"} Dec 06 10:35:16 crc kubenswrapper[4954]: I1206 10:35:16.132002 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerStarted","Data":"5a75d4e727d2c96beeb9a5993322bd8428af6a77b4271848eabf44fc42fd07a0"} Dec 06 10:35:16 crc kubenswrapper[4954]: I1206 10:35:16.158798 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zd4rh" podStartSLOduration=3.642718478 podStartE2EDuration="7.158775372s" podCreationTimestamp="2025-12-06 10:35:09 +0000 UTC" firstStartedPulling="2025-12-06 10:35:12.078238929 +0000 UTC m=+13086.891598318" lastFinishedPulling="2025-12-06 10:35:15.594295813 +0000 UTC m=+13090.407655212" observedRunningTime="2025-12-06 10:35:16.150119851 +0000 UTC m=+13090.963479260" watchObservedRunningTime="2025-12-06 10:35:16.158775372 +0000 UTC m=+13090.972134761" Dec 06 10:35:20 crc kubenswrapper[4954]: I1206 10:35:20.346720 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:20 crc kubenswrapper[4954]: I1206 10:35:20.347167 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:20 crc kubenswrapper[4954]: I1206 10:35:20.414268 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:21 crc kubenswrapper[4954]: I1206 10:35:21.232419 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:21 crc kubenswrapper[4954]: I1206 10:35:21.282842 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:23 crc kubenswrapper[4954]: I1206 10:35:23.194263 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zd4rh" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="registry-server" containerID="cri-o://5a75d4e727d2c96beeb9a5993322bd8428af6a77b4271848eabf44fc42fd07a0" gracePeriod=2 Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.211872 4954 generic.go:334] "Generic (PLEG): container finished" podID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerID="5a75d4e727d2c96beeb9a5993322bd8428af6a77b4271848eabf44fc42fd07a0" exitCode=0 Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.211937 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerDied","Data":"5a75d4e727d2c96beeb9a5993322bd8428af6a77b4271848eabf44fc42fd07a0"} Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.814076 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.994229 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities\") pod \"54236f39-a086-4d56-8939-fdc29dfa7c10\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.994749 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content\") pod \"54236f39-a086-4d56-8939-fdc29dfa7c10\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.994935 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssdzk\" (UniqueName: \"kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk\") pod \"54236f39-a086-4d56-8939-fdc29dfa7c10\" (UID: \"54236f39-a086-4d56-8939-fdc29dfa7c10\") " Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.995210 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities" (OuterVolumeSpecName: "utilities") pod "54236f39-a086-4d56-8939-fdc29dfa7c10" (UID: "54236f39-a086-4d56-8939-fdc29dfa7c10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:35:24 crc kubenswrapper[4954]: I1206 10:35:24.995725 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.001865 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk" (OuterVolumeSpecName: "kube-api-access-ssdzk") pod "54236f39-a086-4d56-8939-fdc29dfa7c10" (UID: "54236f39-a086-4d56-8939-fdc29dfa7c10"). InnerVolumeSpecName "kube-api-access-ssdzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.050773 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54236f39-a086-4d56-8939-fdc29dfa7c10" (UID: "54236f39-a086-4d56-8939-fdc29dfa7c10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.100512 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54236f39-a086-4d56-8939-fdc29dfa7c10-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.100548 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssdzk\" (UniqueName: \"kubernetes.io/projected/54236f39-a086-4d56-8939-fdc29dfa7c10-kube-api-access-ssdzk\") on node \"crc\" DevicePath \"\"" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.228100 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd4rh" event={"ID":"54236f39-a086-4d56-8939-fdc29dfa7c10","Type":"ContainerDied","Data":"1e8e8c60f9f75c9895be561f7fef619139d78cbf0df8f2f236cc2aa102b41bc9"} Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.228171 4954 scope.go:117] "RemoveContainer" containerID="5a75d4e727d2c96beeb9a5993322bd8428af6a77b4271848eabf44fc42fd07a0" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.228179 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zd4rh" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.274695 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.279035 4954 scope.go:117] "RemoveContainer" containerID="0f8ad844be8b8f796ade7368a755035239523d42b0501b65178845a08c61f409" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.295028 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zd4rh"] Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.352139 4954 scope.go:117] "RemoveContainer" containerID="f960c4c731082eb761a81a6ff4f2274c135773ffb971331a1dba263791869dd1" Dec 06 10:35:25 crc kubenswrapper[4954]: I1206 10:35:25.456585 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" path="/var/lib/kubelet/pods/54236f39-a086-4d56-8939-fdc29dfa7c10/volumes" Dec 06 10:36:40 crc kubenswrapper[4954]: I1206 10:36:40.102780 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:36:40 crc kubenswrapper[4954]: I1206 10:36:40.106082 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:37:10 crc kubenswrapper[4954]: I1206 10:37:10.101118 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:37:10 crc kubenswrapper[4954]: I1206 10:37:10.101705 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.098436 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3e2b2f86-d759-4e14-a011-a0060c5003a2/init-config-reloader/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.384104 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3e2b2f86-d759-4e14-a011-a0060c5003a2/init-config-reloader/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.446929 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3e2b2f86-d759-4e14-a011-a0060c5003a2/alertmanager/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.549180 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3e2b2f86-d759-4e14-a011-a0060c5003a2/config-reloader/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.762810 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c8944031-0936-4953-849c-83eb9f2d9d7f/aodh-listener/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.766865 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c8944031-0936-4953-849c-83eb9f2d9d7f/aodh-api/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.919057 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c8944031-0936-4953-849c-83eb9f2d9d7f/aodh-evaluator/0.log" Dec 06 10:37:27 crc kubenswrapper[4954]: I1206 10:37:27.923232 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c8944031-0936-4953-849c-83eb9f2d9d7f/aodh-notifier/0.log" Dec 06 10:37:28 crc kubenswrapper[4954]: I1206 10:37:28.103659 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9d56f68b6-zctqj_f4aeecef-ecdc-4aa0-861b-d0a262302982/barbican-api/0.log" Dec 06 10:37:28 crc kubenswrapper[4954]: I1206 10:37:28.183520 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9d56f68b6-zctqj_f4aeecef-ecdc-4aa0-861b-d0a262302982/barbican-api-log/0.log" Dec 06 10:37:28 crc kubenswrapper[4954]: I1206 10:37:28.483803 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-774cb646b6-tnq5v_04dc125f-8988-4195-975a-a8156cfc59f4/barbican-keystone-listener/0.log" Dec 06 10:37:28 crc kubenswrapper[4954]: I1206 10:37:28.552974 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74c854b897-gshsw_c3694a89-4fb6-43ef-abe2-dcc7f455d4d4/barbican-worker/0.log" Dec 06 10:37:28 crc kubenswrapper[4954]: I1206 10:37:28.801980 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74c854b897-gshsw_c3694a89-4fb6-43ef-abe2-dcc7f455d4d4/barbican-worker-log/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.014792 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-mxjln_c6c6edf1-d239-4369-a85a-822e425a2909/bootstrap-openstack-openstack-cell1/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.047224 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-774cb646b6-tnq5v_04dc125f-8988-4195-975a-a8156cfc59f4/barbican-keystone-listener-log/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.245648 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-6c94j_b6ef0d02-4ed6-4021-b60d-a06dc6566c48/bootstrap-openstack-openstack-networker/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.364271 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_185b09ad-fc16-4179-bafe-cd1bf48fef37/ceilometer-central-agent/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.599853 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_185b09ad-fc16-4179-bafe-cd1bf48fef37/proxy-httpd/0.log" Dec 06 10:37:29 crc kubenswrapper[4954]: I1206 10:37:29.764053 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_185b09ad-fc16-4179-bafe-cd1bf48fef37/sg-core/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.131434 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_185b09ad-fc16-4179-bafe-cd1bf48fef37/ceilometer-notification-agent/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.142832 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_93ed7eaf-c22c-4933-943f-0bd990945ef1/cinder-api/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.331071 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_93ed7eaf-c22c-4933-943f-0bd990945ef1/cinder-api-log/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.741593 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_905b98df-c8e3-4553-9e44-6ce21971e83a/cinder-scheduler/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.770471 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_905b98df-c8e3-4553-9e44-6ce21971e83a/probe/0.log" Dec 06 10:37:30 crc kubenswrapper[4954]: I1206 10:37:30.824730 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-lxt7m_723e1f46-4585-4b35-ab0f-cdd381997052/configure-network-openstack-openstack-cell1/0.log" Dec 06 10:37:31 crc kubenswrapper[4954]: I1206 10:37:31.017758 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-dkvqf_68cf26b2-12c2-40a9-9dcc-42048e97c0f4/configure-network-openstack-openstack-networker/0.log" Dec 06 10:37:31 crc kubenswrapper[4954]: I1206 10:37:31.161984 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-67w67_bc171c73-4a40-48cc-9a75-9dd222a9da01/configure-os-openstack-openstack-cell1/0.log" Dec 06 10:37:31 crc kubenswrapper[4954]: I1206 10:37:31.417296 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-tkq4q_e3e7f64c-c6f1-4a15-8bf2-26f4eb13f5b1/configure-os-openstack-openstack-networker/0.log" Dec 06 10:37:31 crc kubenswrapper[4954]: I1206 10:37:31.546325 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7569c45955-cmgjc_9b7d4e13-c0ed-4d44-8586-0706363924f6/init/0.log" Dec 06 10:37:31 crc kubenswrapper[4954]: I1206 10:37:31.932156 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7569c45955-cmgjc_9b7d4e13-c0ed-4d44-8586-0706363924f6/init/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.000505 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-bjn9q_7fda341b-7000-4aab-ba41-ba427a3c33bd/download-cache-openstack-openstack-cell1/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.257466 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-8wnbc_8d75c3e1-cfed-4a30-867d-e9aeabd7cee7/download-cache-openstack-openstack-networker/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.386049 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0be3b2df-5fe0-44f7-aa96-6c714b5b96b3/glance-log/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.494840 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0be3b2df-5fe0-44f7-aa96-6c714b5b96b3/glance-httpd/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.705918 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d9a2d5ad-ab05-499d-afa3-c52316bb2502/glance-httpd/0.log" Dec 06 10:37:32 crc kubenswrapper[4954]: I1206 10:37:32.942194 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d9a2d5ad-ab05-499d-afa3-c52316bb2502/glance-log/0.log" Dec 06 10:37:33 crc kubenswrapper[4954]: I1206 10:37:33.463413 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-944444f77-s592d_2027c398-a02d-4d9b-a2b7-5ff833e850c3/heat-engine/0.log" Dec 06 10:37:34 crc kubenswrapper[4954]: I1206 10:37:34.012259 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c75d6449b-lzlh9_42795beb-5796-4fb5-a767-6d241d559e75/horizon/0.log" Dec 06 10:37:34 crc kubenswrapper[4954]: I1206 10:37:34.092497 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-79d7bb7d88-c64lf_4827e274-0c02-46b7-a637-412a532734d8/heat-api/0.log" Dec 06 10:37:34 crc kubenswrapper[4954]: I1206 10:37:34.395511 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-75b9554dd5-rd52z_7e0d0be9-4d2e-48f5-a8b7-cb697b8bba66/heat-cfnapi/0.log" Dec 06 10:37:34 crc kubenswrapper[4954]: I1206 10:37:34.580986 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-5hwtq_772337e6-04b4-40a1-9cc9-2a482d88b845/install-certs-openstack-openstack-cell1/0.log" Dec 06 10:37:34 crc kubenswrapper[4954]: I1206 10:37:34.938899 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-bvkwz_0a15c7ab-e735-41c0-ba7b-d2cb7eb8912b/install-certs-openstack-openstack-networker/0.log" Dec 06 10:37:35 crc kubenswrapper[4954]: I1206 10:37:35.208313 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-pxq8m_ceb965f1-31a2-439a-aee5-4619e914a9f0/install-os-openstack-openstack-cell1/0.log" Dec 06 10:37:35 crc kubenswrapper[4954]: I1206 10:37:35.268668 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-r9k8t_191da969-b76a-4ca2-95fd-101f479af42e/install-os-openstack-openstack-networker/0.log" Dec 06 10:37:35 crc kubenswrapper[4954]: I1206 10:37:35.393355 
4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c75d6449b-lzlh9_42795beb-5796-4fb5-a767-6d241d559e75/horizon-log/0.log" Dec 06 10:37:35 crc kubenswrapper[4954]: I1206 10:37:35.698849 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416861-2xng2_36e918d7-8a47-464a-be52-aa9a7f8a875a/keystone-cron/0.log" Dec 06 10:37:35 crc kubenswrapper[4954]: I1206 10:37:35.962965 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416921-n7phh_df3840b1-bd73-4b75-9b02-2e48ccc35182/keystone-cron/0.log" Dec 06 10:37:36 crc kubenswrapper[4954]: I1206 10:37:36.198376 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a51b2504-4a33-4058-be27-67afbfe5efc4/kube-state-metrics/0.log" Dec 06 10:37:36 crc kubenswrapper[4954]: I1206 10:37:36.599121 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-rjsg8_5f829cef-d637-404f-b735-02768488c5f7/libvirt-openstack-openstack-cell1/0.log" Dec 06 10:37:36 crc kubenswrapper[4954]: I1206 10:37:36.816684 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6dc49868fc-6xrnw_9b2e2368-f7ef-44f1-a105-8c72469d28a1/keystone-api/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.080682 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7569c45955-cmgjc_9b7d4e13-c0ed-4d44-8586-0706363924f6/dnsmasq-dns/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.475986 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-wfjjc_d2897056-ac73-4b75-8fe9-932a99a6bf80/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.559602 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-686ff754fc-sms8t_02ebab58-0e79-4851-9845-15e647a11e68/neutron-httpd/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.635625 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-686ff754fc-sms8t_02ebab58-0e79-4851-9845-15e647a11e68/neutron-api/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.851651 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-5mxhl_9135ca19-eb24-4e05-a3b7-a8511ef9368e/neutron-metadata-openstack-openstack-cell1/0.log" Dec 06 10:37:37 crc kubenswrapper[4954]: I1206 10:37:37.878680 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2f56f9e3-a30b-4342-981a-170433fd75ea/memcached/0.log" Dec 06 10:37:38 crc kubenswrapper[4954]: I1206 10:37:38.095660 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-hrbwm_50fb29f7-2be5-45d6-b204-692aec122a55/neutron-metadata-openstack-openstack-networker/0.log" Dec 06 10:37:38 crc kubenswrapper[4954]: I1206 10:37:38.240673 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-gg4q9_3aa52c3d-befd-4077-b47c-b56664b536eb/neutron-sriov-openstack-openstack-cell1/0.log" Dec 06 10:37:38 crc kubenswrapper[4954]: I1206 10:37:38.719113 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_45520f31-d94a-4fb2-85a6-f5b9bc5ef338/nova-api-log/0.log" Dec 06 10:37:38 crc kubenswrapper[4954]: I1206 10:37:38.984914 4954 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-cell0-conductor-0_745282f6-0c52-4f5b-a2e2-6e30a5a6d763/nova-cell0-conductor-conductor/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.002503 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_45520f31-d94a-4fb2-85a6-f5b9bc5ef338/nova-api-api/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.130578 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1ff9df60-ff7e-4974-94d3-eb3d76cf6887/nova-cell1-conductor-conductor/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.324007 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgxfzx_4787a099-d3d6-4ae4-9cfa-b50d3d082889/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.353706 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7f9f9d71-3f70-4ef5-9d90-f0d792e5b646/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.524222 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-vg4vw_88a2ce35-e4bc-4a06-9f75-9dfc51828c7c/nova-cell1-openstack-openstack-cell1/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.600534 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e24ace15-e3ed-42ec-ba16-dfc982a11a0c/nova-metadata-log/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.961002 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_237adbf8-c3e0-427c-ac06-656c110de87b/mysql-bootstrap/0.log" Dec 06 10:37:39 crc kubenswrapper[4954]: I1206 10:37:39.970901 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_75659995-67ef-4ae7-8908-f47ff93e36eb/nova-scheduler-scheduler/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.100876 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.100944 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.101002 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.101877 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.101944 4954 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" gracePeriod=600 Dec 06 10:37:40 crc kubenswrapper[4954]: E1206 10:37:40.232254 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.298122 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_237adbf8-c3e0-427c-ac06-656c110de87b/mysql-bootstrap/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.351226 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ef4d1830-bfa7-4aba-8718-a7e540e52222/mysql-bootstrap/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.374342 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_237adbf8-c3e0-427c-ac06-656c110de87b/galera/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.633613 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ef4d1830-bfa7-4aba-8718-a7e540e52222/mysql-bootstrap/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.665525 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ef4d1830-bfa7-4aba-8718-a7e540e52222/galera/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.710270 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e24ace15-e3ed-42ec-ba16-dfc982a11a0c/nova-metadata-metadata/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.785713 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a4494926-d786-4b95-b768-7434a47be11b/openstackclient/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.902314 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56016706-35a9-41e9-afef-3555f10a4e94/openstack-network-exporter/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.950656 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56016706-35a9-41e9-afef-3555f10a4e94/ovn-northd/0.log" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.996486 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" exitCode=0 Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.996528 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"} Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.996588 4954 scope.go:117] "RemoveContainer" containerID="cd86614438811c10be67f6ffbdd9b17cbf00541ebcf0ba5313b72e6afc36b20c" Dec 06 10:37:40 crc kubenswrapper[4954]: I1206 10:37:40.997523 4954 scope.go:117] 
"RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:37:40 crc kubenswrapper[4954]: E1206 10:37:40.997911 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.162998 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-g2lrh_025be865-7bd0-4184-b542-76879dcb05c4/ovn-openstack-openstack-cell1/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.321434 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-ftv6g_1668e041-54f7-4e5c-8b9e-0c5bb97cdf77/ovn-openstack-openstack-networker/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.440207 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1dd08ec8-09d0-414b-a539-44085352f9d8/openstack-network-exporter/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.517641 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1dd08ec8-09d0-414b-a539-44085352f9d8/ovsdbserver-nb/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.590151 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_81b9c775-f786-4b39-9b42-9a3ad830eb17/openstack-network-exporter/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.621244 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_81b9c775-f786-4b39-9b42-9a3ad830eb17/ovsdbserver-nb/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.776300 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_da5862bf-e1b3-4b7c-8730-47c2de9e7f40/openstack-network-exporter/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.875953 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_da5862bf-e1b3-4b7c-8730-47c2de9e7f40/ovsdbserver-nb/0.log" Dec 06 10:37:41 crc kubenswrapper[4954]: I1206 10:37:41.894555 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0d819f14-112f-40a4-82df-4c2d74278715/openstack-network-exporter/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.054926 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0d819f14-112f-40a4-82df-4c2d74278715/ovsdbserver-sb/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.079679 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_56b50592-4180-4185-a6df-1ae1346e18b1/openstack-network-exporter/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.120067 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_56b50592-4180-4185-a6df-1ae1346e18b1/ovsdbserver-sb/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.291326 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_78ff1ab7-855e-450b-86e4-12a4e394d429/openstack-network-exporter/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 
10:37:42.308506 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_78ff1ab7-855e-450b-86e4-12a4e394d429/ovsdbserver-sb/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.520181 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75f95f96c6-h66hl_4e149340-c1cd-4bc4-acc8-396bc4c314f7/placement-api/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.775058 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cbr7tm_05fb018b-abd2-4aec-8e4f-7705beb33bf6/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 06 10:37:42 crc kubenswrapper[4954]: I1206 10:37:42.780494 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75f95f96c6-h66hl_4e149340-c1cd-4bc4-acc8-396bc4c314f7/placement-log/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.165143 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-ndwp5v_e11e817c-9bd1-4276-b2ec-c7732d9950ff/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.239379 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_82879efe-6f16-4ce2-b4cb-17ddd3349804/init-config-reloader/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.408634 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_82879efe-6f16-4ce2-b4cb-17ddd3349804/config-reloader/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.417120 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_82879efe-6f16-4ce2-b4cb-17ddd3349804/prometheus/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.489353 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_82879efe-6f16-4ce2-b4cb-17ddd3349804/init-config-reloader/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.524011 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_82879efe-6f16-4ce2-b4cb-17ddd3349804/thanos-sidecar/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.627161 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_813293b6-616e-4bc6-a118-c73410b6cc88/setup-container/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.850758 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_813293b6-616e-4bc6-a118-c73410b6cc88/setup-container/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.875929 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_813293b6-616e-4bc6-a118-c73410b6cc88/rabbitmq/0.log" Dec 06 10:37:43 crc kubenswrapper[4954]: I1206 10:37:43.945794 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_350fbf38-ed6d-43f8-bb3f-54afba2c0a08/setup-container/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.131703 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_350fbf38-ed6d-43f8-bb3f-54afba2c0a08/setup-container/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.223675 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_350fbf38-ed6d-43f8-bb3f-54afba2c0a08/rabbitmq/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.268972 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-k6tcb_4b2b5006-9910-4d4f-86f1-99ccb1c6f942/reboot-os-openstack-openstack-cell1/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.376058 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-qwhjm_cfb803e6-8e44-4318-a660-4b90c4b19150/reboot-os-openstack-openstack-networker/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.460489 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-networker-z82wc_00ae1c89-e237-492e-b815-d385026df7e6/run-os-openstack-openstack-networker/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.476209 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-skz67_10123b35-1d1c-41fa-b752-896706a99792/run-os-openstack-openstack-cell1/0.log" Dec 06 10:37:44 crc kubenswrapper[4954]: I1206 10:37:44.766450 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-jgtdv_1e80411b-5c3f-4ea5-9963-6fb8b639ae94/ssh-known-hosts-openstack/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.003287 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d4b995744-2qch6_4f6b671e-20ef-4d02-aedd-a5d46bc23b40/proxy-httpd/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.064949 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d4b995744-2qch6_4f6b671e-20ef-4d02-aedd-a5d46bc23b40/proxy-server/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.139734 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tfp9f_72a00ae9-92aa-42b4-8e20-dd0bbb65c689/swift-ring-rebalance/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.314352 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/account-auditor/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.366585 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/account-reaper/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.425440 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/account-server/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.439987 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/container-auditor/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.441004 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/account-replicator/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.606681 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/container-replicator/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.647922 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/object-expirer/0.log" Dec 
06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.673357 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/object-auditor/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.708499 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/container-updater/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.904882 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/object-updater/0.log" Dec 06 10:37:45 crc kubenswrapper[4954]: I1206 10:37:45.916658 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/object-replicator/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.065767 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/container-server/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.230213 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/swift-recon-cron/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.308552 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/account-auditor/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.521143 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/account-reaper/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.541457 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/object-server/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.544500 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa5412d3-ec01-4e3f-9525-89669e6a87fc/rsync/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.617357 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/account-replicator/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.803047 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/account-server/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.811525 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/container-replicator/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.814193 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/container-auditor/0.log" Dec 06 10:37:46 crc kubenswrapper[4954]: I1206 10:37:46.934040 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/container-updater/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.029859 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/object-expirer/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.101696 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/object-auditor/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.134753 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/object-replicator/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.336538 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/object-updater/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.507278 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/swift-recon-cron/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.675194 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/container-server/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.779186 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/account-auditor/0.log" Dec 06 10:37:47 crc kubenswrapper[4954]: I1206 10:37:47.875423 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/object-server/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.015668 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_fb7ca5c7-3922-457c-9709-b024e8587ced/rsync/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.104628 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/account-reaper/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.146919 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/account-replicator/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.183171 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/account-server/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.217292 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/container-auditor/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.295045 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/container-replicator/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.369510 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/container-updater/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.564926 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/object-expirer/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.664323 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/object-auditor/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.709477 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/container-server/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: 
I1206 10:37:48.721761 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/object-replicator/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.912059 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/object-updater/0.log" Dec 06 10:37:48 crc kubenswrapper[4954]: I1206 10:37:48.989270 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/swift-recon-cron/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.027983 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/object-server/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.231298 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-2k4qx_509d808a-804b-405a-9a0a-545a8d15a90e/telemetry-openstack-openstack-cell1/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.356670 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_0054b4c8-5d04-47a9-8794-992ac486c936/rsync/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.399707 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_86f6275c-9439-4a32-a0b7-467f7df9670f/tempest-tests-tempest-tests-runner/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.560308 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_540573fa-cbaa-42cc-9ed7-a927a28dc7c9/test-operator-logs-container/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.682194 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-phhr8_cbcd2e0b-c24c-4c7e-bad8-82b47335ace8/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.807489 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-bs846_310d8011-2558-4976-a3f3-0c28d9ea366f/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Dec 06 10:37:49 crc kubenswrapper[4954]: I1206 10:37:49.996163 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-lnnrb_8772370c-5e61-442a-a0c4-ca20e1e43d07/validate-network-openstack-openstack-cell1/0.log" Dec 06 10:37:50 crc kubenswrapper[4954]: I1206 10:37:50.113780 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-vk44j_b3a3b17a-af50-4e16-8ee2-011bfe71370b/validate-network-openstack-openstack-networker/0.log" Dec 06 10:37:55 crc kubenswrapper[4954]: I1206 10:37:55.454523 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:37:55 crc kubenswrapper[4954]: E1206 10:37:55.455810 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" 
podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:07 crc kubenswrapper[4954]: I1206 10:38:07.443454 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:38:07 crc kubenswrapper[4954]: E1206 10:38:07.444182 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:19 crc kubenswrapper[4954]: I1206 10:38:19.443664 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:38:19 crc kubenswrapper[4954]: E1206 10:38:19.444657 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.295204 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:38:23 crc kubenswrapper[4954]: E1206 10:38:23.295991 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="extract-content" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.296005 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="extract-content" Dec 06 10:38:23 crc kubenswrapper[4954]: E1206 10:38:23.296056 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="registry-server" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.296062 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="registry-server" Dec 06 10:38:23 crc kubenswrapper[4954]: E1206 10:38:23.296087 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="extract-utilities" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.296093 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="extract-utilities" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.296311 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="54236f39-a086-4d56-8939-fdc29dfa7c10" containerName="registry-server" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.297938 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.352009 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.436993 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.437856 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxbt4\" (UniqueName: \"kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.438292 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.540448 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxbt4\" (UniqueName: \"kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.540620 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.540702 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.541165 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.541238 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.559499 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wxbt4\" (UniqueName: \"kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4\") pod \"redhat-operators-ttsrz\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:23 crc kubenswrapper[4954]: I1206 10:38:23.682761 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:24 crc kubenswrapper[4954]: I1206 10:38:24.719028 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:38:25 crc kubenswrapper[4954]: I1206 10:38:25.531581 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerStarted","Data":"66fce05b5d5722c59a92ee135130a60c0ec9ac07427edfab6df329e32d009801"} Dec 06 10:38:26 crc kubenswrapper[4954]: I1206 10:38:26.565538 4954 generic.go:334] "Generic (PLEG): container finished" podID="879ab57c-2611-4c25-be84-ab404c500a86" containerID="a98e40705751262de7cf7cc7e568276093f91635dff25b50fa8961ae2fd5382c" exitCode=0 Dec 06 10:38:26 crc kubenswrapper[4954]: I1206 10:38:26.566976 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerDied","Data":"a98e40705751262de7cf7cc7e568276093f91635dff25b50fa8961ae2fd5382c"} Dec 06 10:38:26 crc kubenswrapper[4954]: I1206 10:38:26.571573 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:38:29 crc kubenswrapper[4954]: I1206 10:38:29.616524 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerStarted","Data":"8d50027dcc836c94eb4e37a02cbf0c8249b50d4dd30b1349c01350c3cedcd584"} Dec 06 10:38:33 crc kubenswrapper[4954]: I1206 10:38:33.444118 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:38:33 crc kubenswrapper[4954]: E1206 10:38:33.445638 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:35 crc kubenswrapper[4954]: I1206 10:38:35.674490 4954 generic.go:334] "Generic (PLEG): container finished" podID="879ab57c-2611-4c25-be84-ab404c500a86" containerID="8d50027dcc836c94eb4e37a02cbf0c8249b50d4dd30b1349c01350c3cedcd584" exitCode=0 Dec 06 10:38:35 crc kubenswrapper[4954]: I1206 10:38:35.674603 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerDied","Data":"8d50027dcc836c94eb4e37a02cbf0c8249b50d4dd30b1349c01350c3cedcd584"} Dec 06 10:38:37 crc kubenswrapper[4954]: I1206 10:38:37.274188 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-p95p2" podUID="bda5decc-cd20-487a-b0b5-01058cac828c" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.433028 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/util/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.433131 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/util/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.437664 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/pull/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.437939 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/pull/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.662467 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/extract/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.880305 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vkd9c_5f12ac11-7904-4ca1-beff-3bc654b8bf24/kube-rbac-proxy/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.883390 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/util/0.log" Dec 06 10:38:43 crc kubenswrapper[4954]: I1206 10:38:43.955259 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaflmcs7_e725d552-aee7-4bc6-abff-f04a2b2522b2/pull/0.log" Dec 06 10:38:44 crc kubenswrapper[4954]: I1206 10:38:44.130909 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vkd9c_5f12ac11-7904-4ca1-beff-3bc654b8bf24/manager/0.log" Dec 06 10:38:44 crc kubenswrapper[4954]: I1206 10:38:44.220228 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kkcv7_0e05eb35-0ec5-4760-8c17-fa88a565b38c/kube-rbac-proxy/0.log" Dec 06 10:38:44 crc kubenswrapper[4954]: I1206 10:38:44.480199 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kkcv7_0e05eb35-0ec5-4760-8c17-fa88a565b38c/manager/0.log" Dec 06 10:38:44 crc kubenswrapper[4954]: I1206 10:38:44.482439 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l8h49_49a30376-5e1f-4065-a8cd-b728b7413a07/kube-rbac-proxy/0.log" Dec 06 10:38:44 crc kubenswrapper[4954]: I1206 10:38:44.644125 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l8h49_49a30376-5e1f-4065-a8cd-b728b7413a07/manager/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.349447 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hp8pr_5c0b16d3-8d9a-44c8-882b-8a90fd89379d/kube-rbac-proxy/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.361559 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-pqzdl_a47971dc-993b-47f7-b65b-4348dfd56866/kube-rbac-proxy/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.405470 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-pqzdl_a47971dc-993b-47f7-b65b-4348dfd56866/manager/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.735686 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jsxl2_62e28422-a646-45cc-9a4b-f3de5e2fc463/kube-rbac-proxy/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.753024 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jsxl2_62e28422-a646-45cc-9a4b-f3de5e2fc463/manager/0.log" Dec 06 10:38:45 crc kubenswrapper[4954]: I1206 10:38:45.847526 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hp8pr_5c0b16d3-8d9a-44c8-882b-8a90fd89379d/manager/0.log" Dec 06 10:38:46 crc kubenswrapper[4954]: I1206 10:38:46.268206 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fn6mj_41e31a3c-2482-492f-a8d6-220d9aaeefe3/kube-rbac-proxy/0.log" Dec 06 10:38:46 crc kubenswrapper[4954]: I1206 10:38:46.308923 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4kdrf_17e0dd71-107b-454c-97f6-134644f51144/kube-rbac-proxy/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.079931 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t67pd_b6dcd22b-76ae-440f-8f04-2d2ea96a07f5/kube-rbac-proxy/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.086439 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4kdrf_17e0dd71-107b-454c-97f6-134644f51144/manager/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.214814 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fn6mj_41e31a3c-2482-492f-a8d6-220d9aaeefe3/manager/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.346295 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t67pd_b6dcd22b-76ae-440f-8f04-2d2ea96a07f5/manager/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.443329 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:38:47 crc kubenswrapper[4954]: E1206 10:38:47.443727 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.608886 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-m9bfd_f0e990d9-0ccf-44f0-9811-265e79f933c3/kube-rbac-proxy/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.612099 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-m9bfd_f0e990d9-0ccf-44f0-9811-265e79f933c3/manager/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.744684 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-vmctk_02417e24-828d-4cb4-9a33-3908caad8c9c/kube-rbac-proxy/0.log" Dec 06 10:38:47 crc kubenswrapper[4954]: I1206 10:38:47.927097 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-vmctk_02417e24-828d-4cb4-9a33-3908caad8c9c/manager/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.035710 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dltbn_8b7cf0fe-276e-41bf-88a8-76f5f56e884e/manager/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.131375 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dltbn_8b7cf0fe-276e-41bf-88a8-76f5f56e884e/kube-rbac-proxy/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.446232 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-8dxm6_2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b/manager/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.491352 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-8dxm6_2d02b32a-0bae-4d0e-9b2a-14d8f2cdfb9b/kube-rbac-proxy/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.493422 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerStarted","Data":"23c4ca6bf6f816c3a88845d871beff750403ac3eb22635acf5b9ca12d4314e25"} Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.510021 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttsrz" podStartSLOduration=4.453492253 podStartE2EDuration="25.510003116s" podCreationTimestamp="2025-12-06 10:38:23 +0000 UTC" firstStartedPulling="2025-12-06 10:38:26.571283904 +0000 UTC m=+13281.384643293" lastFinishedPulling="2025-12-06 10:38:47.627794777 +0000 UTC m=+13302.441154156" observedRunningTime="2025-12-06 10:38:48.508588738 +0000 UTC m=+13303.321948127" watchObservedRunningTime="2025-12-06 10:38:48.510003116 +0000 UTC m=+13303.323362515" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.565662 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-g2rfv_5dac40b5-0993-4696-995b-2476436126fc/kube-rbac-proxy/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.713229 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-g2rfv_5dac40b5-0993-4696-995b-2476436126fc/manager/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.833342 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f56gzs7_f97f4b54-5ea8-4fe9-b0bc-4937979ad468/kube-rbac-proxy/0.log" Dec 06 10:38:48 crc kubenswrapper[4954]: I1206 10:38:48.863875 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f56gzs7_f97f4b54-5ea8-4fe9-b0bc-4937979ad468/manager/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.188639 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-khxwr_c6aa3674-f8f4-4a0e-bc00-2629d49818e7/kube-rbac-proxy/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.322757 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-h2z85_e3a2a692-b8fa-4f39-b507-9dd36ab9593e/operator/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.398917 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-khxwr_c6aa3674-f8f4-4a0e-bc00-2629d49818e7/manager/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.529008 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ts4vc_4ee8b8a2-75dc-4a36-93e4-4c7b6da2432a/registry-server/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.791697 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w7lxk_4aa03482-7c89-4997-b9a4-096ba0f2d47a/manager/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.804455 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w7lxk_4aa03482-7c89-4997-b9a4-096ba0f2d47a/kube-rbac-proxy/0.log" Dec 06 10:38:50 crc kubenswrapper[4954]: I1206 10:38:50.924788 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gt8z5_f5cd0bdb-bcec-4ba6-8272-5fe99b6e4008/operator/0.log" Dec 06 10:38:51 crc kubenswrapper[4954]: I1206 10:38:51.218053 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-frjm7_56b42964-963e-469d-bc21-867a3c560e66/kube-rbac-proxy/0.log" Dec 06 10:38:51 crc kubenswrapper[4954]: I1206 10:38:51.243387 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-frjm7_56b42964-963e-469d-bc21-867a3c560e66/manager/0.log" Dec 06 10:38:51 crc kubenswrapper[4954]: I1206 10:38:51.522348 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2x8x2_ae04bc7e-d1cf-4063-8236-d3717d1b8f51/kube-rbac-proxy/0.log" Dec 06 10:38:51 crc kubenswrapper[4954]: I1206 10:38:51.580268 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fpbr4_d05a311c-788e-41fa-9ddc-52c98fe6dc7c/kube-rbac-proxy/0.log" Dec 06 10:38:51 crc kubenswrapper[4954]: I1206 10:38:51.966943 4954 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fpbr4_d05a311c-788e-41fa-9ddc-52c98fe6dc7c/manager/0.log" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.023501 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2x8x2_ae04bc7e-d1cf-4063-8236-d3717d1b8f51/manager/0.log" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.073514 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-zh2kh_fb5012c2-0242-4c82-bedb-58f94bc2c8d1/kube-rbac-proxy/0.log" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.562542 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-zh2kh_fb5012c2-0242-4c82-bedb-58f94bc2c8d1/manager/0.log" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.749780 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.752267 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.765519 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.847962 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.848006 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.848162 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.951101 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.951469 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.951497 4954 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.952466 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.952524 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:52 crc kubenswrapper[4954]: I1206 10:38:52.981501 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx\") pod \"redhat-marketplace-6knhm\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:53 crc kubenswrapper[4954]: I1206 10:38:53.076118 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:38:53 crc kubenswrapper[4954]: I1206 10:38:53.560746 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-xbgpz_a233e22e-ec80-4af3-b6db-488834c983de/manager/0.log" Dec 06 10:38:53 crc kubenswrapper[4954]: I1206 10:38:53.683668 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:53 crc kubenswrapper[4954]: I1206 10:38:53.685788 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:38:54 crc kubenswrapper[4954]: I1206 10:38:54.034050 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:38:54 crc kubenswrapper[4954]: I1206 10:38:54.590478 4954 generic.go:334] "Generic (PLEG): container finished" podID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerID="26e762062e0d6d3c0e9a33c420babc1050581bda583718da8a3ee324fa62291c" exitCode=0 Dec 06 10:38:54 crc kubenswrapper[4954]: I1206 10:38:54.590550 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerDied","Data":"26e762062e0d6d3c0e9a33c420babc1050581bda583718da8a3ee324fa62291c"} Dec 06 10:38:54 crc kubenswrapper[4954]: I1206 10:38:54.590928 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerStarted","Data":"95de2d68df9e1105e7c9cc2daa67ea9c4fae0c7da774a7361ef90480dfa7b17d"} Dec 06 10:38:54 crc kubenswrapper[4954]: I1206 10:38:54.745246 4954 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-ttsrz" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="registry-server" probeResult="failure" output=< Dec 06 10:38:54 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:38:54 crc kubenswrapper[4954]: > Dec 06 10:38:55 crc kubenswrapper[4954]: I1206 10:38:55.602647 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerStarted","Data":"c3722a7eb1e5aaac3dc0fe0c55a11e8ea37b435d6a354046fa10a1b4e0e794e9"} Dec 06 10:38:56 crc kubenswrapper[4954]: I1206 10:38:56.625514 4954 generic.go:334] "Generic (PLEG): container finished" podID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerID="c3722a7eb1e5aaac3dc0fe0c55a11e8ea37b435d6a354046fa10a1b4e0e794e9" exitCode=0 Dec 06 10:38:56 crc kubenswrapper[4954]: I1206 10:38:56.625592 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerDied","Data":"c3722a7eb1e5aaac3dc0fe0c55a11e8ea37b435d6a354046fa10a1b4e0e794e9"} Dec 06 10:38:58 crc kubenswrapper[4954]: I1206 10:38:58.445711 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:38:58 crc kubenswrapper[4954]: E1206 10:38:58.446744 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:38:58 crc kubenswrapper[4954]: I1206 10:38:58.660580 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerStarted","Data":"efbf196ca5f467d3a920370330074773f1337a455d276de5f76b417ddc93c219"} Dec 06 10:38:58 crc kubenswrapper[4954]: I1206 10:38:58.697048 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6knhm" podStartSLOduration=3.589450704 podStartE2EDuration="6.697029162s" podCreationTimestamp="2025-12-06 10:38:52 +0000 UTC" firstStartedPulling="2025-12-06 10:38:54.592963513 +0000 UTC m=+13309.406322902" lastFinishedPulling="2025-12-06 10:38:57.700541971 +0000 UTC m=+13312.513901360" observedRunningTime="2025-12-06 10:38:58.692934843 +0000 UTC m=+13313.506294232" watchObservedRunningTime="2025-12-06 10:38:58.697029162 +0000 UTC m=+13313.510388551" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.076588 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.077198 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.134224 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.745068 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.785394 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:03 crc kubenswrapper[4954]: I1206 10:39:03.810895 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:39:04 crc kubenswrapper[4954]: I1206 10:39:04.778014 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:39:05 crc kubenswrapper[4954]: I1206 10:39:05.727079 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6knhm" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="registry-server" containerID="cri-o://efbf196ca5f467d3a920370330074773f1337a455d276de5f76b417ddc93c219" gracePeriod=2 Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.179911 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.180407 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttsrz" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="registry-server" containerID="cri-o://23c4ca6bf6f816c3a88845d871beff750403ac3eb22635acf5b9ca12d4314e25" gracePeriod=2 Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.747572 4954 generic.go:334] "Generic (PLEG): container finished" podID="879ab57c-2611-4c25-be84-ab404c500a86" containerID="23c4ca6bf6f816c3a88845d871beff750403ac3eb22635acf5b9ca12d4314e25" exitCode=0 Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.747734 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerDied","Data":"23c4ca6bf6f816c3a88845d871beff750403ac3eb22635acf5b9ca12d4314e25"} Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.751830 4954 generic.go:334] "Generic (PLEG): container finished" podID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerID="efbf196ca5f467d3a920370330074773f1337a455d276de5f76b417ddc93c219" exitCode=0 Dec 06 10:39:06 crc kubenswrapper[4954]: I1206 10:39:06.752035 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerDied","Data":"efbf196ca5f467d3a920370330074773f1337a455d276de5f76b417ddc93c219"} Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.411130 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.539275 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities\") pod \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.539478 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx\") pod \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.539514 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content\") pod \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\" (UID: \"549a9acc-8ef7-4188-aa1c-1f616e54c4be\") " Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.541026 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities" (OuterVolumeSpecName: "utilities") pod "549a9acc-8ef7-4188-aa1c-1f616e54c4be" (UID: "549a9acc-8ef7-4188-aa1c-1f616e54c4be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.547903 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx" (OuterVolumeSpecName: "kube-api-access-kbhnx") pod "549a9acc-8ef7-4188-aa1c-1f616e54c4be" (UID: "549a9acc-8ef7-4188-aa1c-1f616e54c4be"). InnerVolumeSpecName "kube-api-access-kbhnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.572882 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "549a9acc-8ef7-4188-aa1c-1f616e54c4be" (UID: "549a9acc-8ef7-4188-aa1c-1f616e54c4be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.642019 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/549a9acc-8ef7-4188-aa1c-1f616e54c4be-kube-api-access-kbhnx\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.642068 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.642085 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549a9acc-8ef7-4188-aa1c-1f616e54c4be-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.779521 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6knhm" event={"ID":"549a9acc-8ef7-4188-aa1c-1f616e54c4be","Type":"ContainerDied","Data":"95de2d68df9e1105e7c9cc2daa67ea9c4fae0c7da774a7361ef90480dfa7b17d"} Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.779606 4954 scope.go:117] "RemoveContainer" containerID="efbf196ca5f467d3a920370330074773f1337a455d276de5f76b417ddc93c219" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.779651 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6knhm" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.804706 4954 scope.go:117] "RemoveContainer" containerID="c3722a7eb1e5aaac3dc0fe0c55a11e8ea37b435d6a354046fa10a1b4e0e794e9" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.869788 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.879867 4954 scope.go:117] "RemoveContainer" containerID="26e762062e0d6d3c0e9a33c420babc1050581bda583718da8a3ee324fa62291c" Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.900232 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6knhm"] Dec 06 10:39:07 crc kubenswrapper[4954]: I1206 10:39:07.961828 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.053162 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities" (OuterVolumeSpecName: "utilities") pod "879ab57c-2611-4c25-be84-ab404c500a86" (UID: "879ab57c-2611-4c25-be84-ab404c500a86"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.053406 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities\") pod \"879ab57c-2611-4c25-be84-ab404c500a86\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.053537 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content\") pod \"879ab57c-2611-4c25-be84-ab404c500a86\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.066852 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxbt4\" (UniqueName: \"kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4\") pod \"879ab57c-2611-4c25-be84-ab404c500a86\" (UID: \"879ab57c-2611-4c25-be84-ab404c500a86\") " Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.067793 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.073764 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4" (OuterVolumeSpecName: "kube-api-access-wxbt4") pod "879ab57c-2611-4c25-be84-ab404c500a86" (UID: "879ab57c-2611-4c25-be84-ab404c500a86"). InnerVolumeSpecName "kube-api-access-wxbt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.166222 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "879ab57c-2611-4c25-be84-ab404c500a86" (UID: "879ab57c-2611-4c25-be84-ab404c500a86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.170020 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879ab57c-2611-4c25-be84-ab404c500a86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.170094 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxbt4\" (UniqueName: \"kubernetes.io/projected/879ab57c-2611-4c25-be84-ab404c500a86-kube-api-access-wxbt4\") on node \"crc\" DevicePath \"\"" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.793664 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttsrz" event={"ID":"879ab57c-2611-4c25-be84-ab404c500a86","Type":"ContainerDied","Data":"66fce05b5d5722c59a92ee135130a60c0ec9ac07427edfab6df329e32d009801"} Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.793718 4954 scope.go:117] "RemoveContainer" containerID="23c4ca6bf6f816c3a88845d871beff750403ac3eb22635acf5b9ca12d4314e25" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.793853 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttsrz" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.840520 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.860767 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttsrz"] Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.863806 4954 scope.go:117] "RemoveContainer" containerID="8d50027dcc836c94eb4e37a02cbf0c8249b50d4dd30b1349c01350c3cedcd584" Dec 06 10:39:08 crc kubenswrapper[4954]: I1206 10:39:08.946502 4954 scope.go:117] "RemoveContainer" containerID="a98e40705751262de7cf7cc7e568276093f91635dff25b50fa8961ae2fd5382c" Dec 06 10:39:09 crc kubenswrapper[4954]: I1206 10:39:09.458653 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" path="/var/lib/kubelet/pods/549a9acc-8ef7-4188-aa1c-1f616e54c4be/volumes" Dec 06 10:39:09 crc kubenswrapper[4954]: I1206 10:39:09.459482 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879ab57c-2611-4c25-be84-ab404c500a86" path="/var/lib/kubelet/pods/879ab57c-2611-4c25-be84-ab404c500a86/volumes" Dec 06 10:39:12 crc kubenswrapper[4954]: I1206 10:39:12.443823 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:39:12 crc kubenswrapper[4954]: E1206 10:39:12.445539 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:39:21 crc kubenswrapper[4954]: I1206 10:39:21.173950 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wq6gx_5e9d36ab-4a09-4275-9732-cd9bd681a917/control-plane-machine-set-operator/0.log" Dec 06 10:39:21 crc kubenswrapper[4954]: I1206 10:39:21.292573 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z6mbx_2f2c25db-28aa-4b19-8f9a-21b78e02089f/kube-rbac-proxy/0.log" Dec 06 10:39:21 crc kubenswrapper[4954]: I1206 10:39:21.335039 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z6mbx_2f2c25db-28aa-4b19-8f9a-21b78e02089f/machine-api-operator/0.log" Dec 06 10:39:23 crc kubenswrapper[4954]: I1206 10:39:23.443343 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:39:23 crc kubenswrapper[4954]: E1206 10:39:23.443954 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:39:35 crc kubenswrapper[4954]: I1206 10:39:35.453823 4954 scope.go:117] "RemoveContainer" 
containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:39:35 crc kubenswrapper[4954]: E1206 10:39:35.454618 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:39:37 crc kubenswrapper[4954]: I1206 10:39:37.444806 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-qvrwn_3edd2749-556f-4ce2-a0d9-e13cb2455243/cert-manager-controller/0.log" Dec 06 10:39:37 crc kubenswrapper[4954]: I1206 10:39:37.687786 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-djdk2_21c46708-0103-458e-b609-635ce217adb1/cert-manager-webhook/0.log" Dec 06 10:39:37 crc kubenswrapper[4954]: I1206 10:39:37.807699 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-z4hph_e45848cf-9e42-436e-9719-8229bac3fb66/cert-manager-cainjector/0.log" Dec 06 10:39:46 crc kubenswrapper[4954]: I1206 10:39:46.444008 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:39:46 crc kubenswrapper[4954]: E1206 10:39:46.444793 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:39:52 crc kubenswrapper[4954]: I1206 10:39:52.722486 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-zl4qd_86a6ada5-37c3-4250-9f12-d1e208786bcf/nmstate-console-plugin/0.log" Dec 06 10:39:53 crc kubenswrapper[4954]: I1206 10:39:53.076046 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k9nmn_60516977-e70c-4cd1-a029-15f0be8f3a12/nmstate-handler/0.log" Dec 06 10:39:53 crc kubenswrapper[4954]: I1206 10:39:53.096448 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-cvwlf_257074d3-7882-4f60-b7d6-92374480fafa/kube-rbac-proxy/0.log" Dec 06 10:39:53 crc kubenswrapper[4954]: I1206 10:39:53.240660 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-cvwlf_257074d3-7882-4f60-b7d6-92374480fafa/nmstate-metrics/0.log" Dec 06 10:39:53 crc kubenswrapper[4954]: I1206 10:39:53.420732 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-ftlnm_b50936c1-0d88-46be-8fbb-d554594c21a3/nmstate-operator/0.log" Dec 06 10:39:53 crc kubenswrapper[4954]: I1206 10:39:53.538345 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-llch7_2d304ffb-d35d-4464-9f6f-8b1ed846b641/nmstate-webhook/0.log" Dec 06 10:39:58 crc kubenswrapper[4954]: I1206 10:39:58.444269 4954 scope.go:117] "RemoveContainer" 
containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:39:58 crc kubenswrapper[4954]: E1206 10:39:58.445203 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:40:11 crc kubenswrapper[4954]: I1206 10:40:11.443946 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:40:11 crc kubenswrapper[4954]: E1206 10:40:11.444691 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.416014 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ql4ww_ab7695f2-803f-469d-9302-c1bec12de24e/kube-rbac-proxy/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.659763 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-frr-files/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.749186 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-ql4ww_ab7695f2-803f-469d-9302-c1bec12de24e/controller/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.932709 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-metrics/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.939613 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-reloader/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.973257 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-frr-files/0.log" Dec 06 10:40:12 crc kubenswrapper[4954]: I1206 10:40:12.977469 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-reloader/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.158261 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-frr-files/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.200777 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-metrics/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.201930 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-metrics/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.220466 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-reloader/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.351233 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-frr-files/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.380468 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-reloader/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.404094 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/controller/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.442527 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/cp-metrics/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.574132 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/frr-metrics/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.642351 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/kube-rbac-proxy-frr/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.674248 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/kube-rbac-proxy/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.781882 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/reloader/0.log" Dec 06 10:40:13 crc kubenswrapper[4954]: I1206 10:40:13.917809 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-8f4wp_4509df09-42d5-4323-8d31-88fef1e763c2/frr-k8s-webhook-server/0.log" Dec 06 10:40:14 crc kubenswrapper[4954]: I1206 10:40:14.242436 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-574b6cdf6f-tkt77_63d7776b-9c1b-4fe6-a76e-74225b88dab7/manager/0.log" Dec 06 10:40:14 crc kubenswrapper[4954]: I1206 10:40:14.405369 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-767c58648-lwjz8_932f3a22-69f1-427c-8b8c-7d1cb2f32f8a/webhook-server/0.log" Dec 06 10:40:14 crc kubenswrapper[4954]: I1206 10:40:14.517989 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fjt49_3048d4de-ffa6-4164-afde-4c6fbfe6abb3/kube-rbac-proxy/0.log" Dec 06 10:40:15 crc kubenswrapper[4954]: I1206 10:40:15.602600 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fjt49_3048d4de-ffa6-4164-afde-4c6fbfe6abb3/speaker/0.log" Dec 06 10:40:17 crc kubenswrapper[4954]: I1206 10:40:17.296249 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p95p2_bda5decc-cd20-487a-b0b5-01058cac828c/frr/0.log" Dec 06 10:40:23 crc kubenswrapper[4954]: I1206 10:40:23.444381 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:40:23 crc kubenswrapper[4954]: E1206 10:40:23.445909 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:40:29 crc kubenswrapper[4954]: I1206 10:40:29.463454 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/util/0.log" Dec 06 10:40:29 crc kubenswrapper[4954]: I1206 10:40:29.765906 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/util/0.log" Dec 06 10:40:29 crc kubenswrapper[4954]: I1206 10:40:29.782749 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/pull/0.log" Dec 06 10:40:29 crc kubenswrapper[4954]: I1206 10:40:29.809424 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/pull/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.021237 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/pull/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.048854 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/util/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.113487 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afqwrx_456e3ce9-0eb6-466a-92f1-7eaa684b6b7a/extract/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.220493 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/util/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.520210 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/util/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.521176 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/pull/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.629421 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/pull/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.741493 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/pull/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.809250 4954 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/util/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.862751 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f9f9p9_dd3269d5-87e6-4cd2-a667-217f6da3b3f9/extract/0.log" Dec 06 10:40:30 crc kubenswrapper[4954]: I1206 10:40:30.999292 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/util/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.229495 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/pull/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.231026 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/util/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.244835 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/pull/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.432359 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/pull/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.437160 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/util/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.515184 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xw84x_b46b0863-d9e9-4adb-a2ba-0e992b84dc73/extract/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.665906 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/util/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.906127 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/pull/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.940859 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/util/0.log" Dec 06 10:40:31 crc kubenswrapper[4954]: I1206 10:40:31.969964 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/pull/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.257387 4954 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/pull/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.289929 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/util/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.336158 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f837cpsd_688b9e13-c31d-446c-acb7-ba10a4a6a0f1/extract/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.572973 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-utilities/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.807769 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-content/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.819692 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-utilities/0.log" Dec 06 10:40:32 crc kubenswrapper[4954]: I1206 10:40:32.854609 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-content/0.log" Dec 06 10:40:33 crc kubenswrapper[4954]: I1206 10:40:33.090327 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-utilities/0.log" Dec 06 10:40:33 crc kubenswrapper[4954]: I1206 10:40:33.161457 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/extract-content/0.log" Dec 06 10:40:33 crc kubenswrapper[4954]: I1206 10:40:33.456187 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-utilities/0.log" Dec 06 10:40:33 crc kubenswrapper[4954]: I1206 10:40:33.862006 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-content/0.log" Dec 06 10:40:33 crc kubenswrapper[4954]: I1206 10:40:33.891136 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-utilities/0.log" Dec 06 10:40:34 crc kubenswrapper[4954]: I1206 10:40:34.136030 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-content/0.log" Dec 06 10:40:34 crc kubenswrapper[4954]: I1206 10:40:34.344446 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-utilities/0.log" Dec 06 10:40:34 crc kubenswrapper[4954]: I1206 10:40:34.458791 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/extract-content/0.log" Dec 06 10:40:34 crc kubenswrapper[4954]: 
Dec 06 10:40:34 crc kubenswrapper[4954]: I1206 10:40:34.714114 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-p4n6q_61c42882-513e-47f2-8faf-a95e4b0126f6/marketplace-operator/0.log"
Dec 06 10:40:34 crc kubenswrapper[4954]: I1206 10:40:34.896005 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-utilities/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.177857 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-content/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.216324 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-utilities/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.229464 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-content/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.486901 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:40:35 crc kubenswrapper[4954]: E1206 10:40:35.487967 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.558578 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-utilities/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.632521 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/extract-content/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.848930 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sbt66_89a2ab41-05c3-433e-aa02-4cc276fe349a/registry-server/0.log"
Dec 06 10:40:35 crc kubenswrapper[4954]: I1206 10:40:35.902705 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-utilities/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.425486 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-utilities/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.465222 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76xzd_2f2fbf41-1df2-47da-96e7-84c6c4514646/registry-server/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.528299 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-content/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.596052 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-content/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.726936 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-utilities/0.log"
Dec 06 10:40:36 crc kubenswrapper[4954]: I1206 10:40:36.891969 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/extract-content/0.log"
Dec 06 10:40:37 crc kubenswrapper[4954]: I1206 10:40:37.279089 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rthwf_8fbbc4c6-c6da-42af-b612-d7febe934e94/registry-server/0.log"
Dec 06 10:40:38 crc kubenswrapper[4954]: I1206 10:40:38.443290 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hzqhb_6ccf6324-bb8e-416b-b705-27a3b16f01ed/registry-server/0.log"
Dec 06 10:40:41 crc kubenswrapper[4954]: I1206 10:40:41.804157 4954 scope.go:117] "RemoveContainer" containerID="d10aafe02940ced5843237f6d252bfca0a86644203c9447ed7c754a7aea0b381"
Dec 06 10:40:46 crc kubenswrapper[4954]: I1206 10:40:46.444469 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:40:46 crc kubenswrapper[4954]: E1206 10:40:46.445927 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:40:51 crc kubenswrapper[4954]: I1206 10:40:51.658204 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-wdm7s_c5a70aef-4a54-42be-9bf4-1832a3497ac5/prometheus-operator/0.log"
Dec 06 10:40:51 crc kubenswrapper[4954]: I1206 10:40:51.781275 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7ff4d4474f-f8lph_fb1ff733-3aca-4019-8535-ce5a462c6099/prometheus-operator-admission-webhook/0.log"
Dec 06 10:40:52 crc kubenswrapper[4954]: I1206 10:40:52.056736 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7ff4d4474f-prltb_fe9110b5-92c9-4f7a-8f19-e8eef8fd7040/prometheus-operator-admission-webhook/0.log"
Dec 06 10:40:52 crc kubenswrapper[4954]: I1206 10:40:52.070691 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-bppch_acee39da-b368-4329-921b-8051f2493627/operator/0.log"
Dec 06 10:40:52 crc kubenswrapper[4954]: I1206 10:40:52.529129 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-fmvfc_4105a6d8-9e43-435c-b69f-de2f10d97eea/perses-operator/0.log"
Dec 06 10:40:59 crc kubenswrapper[4954]: I1206 10:40:59.443491 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:40:59 crc kubenswrapper[4954]: E1206 10:40:59.444183 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:41:13 crc kubenswrapper[4954]: I1206 10:41:13.443752 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:41:13 crc kubenswrapper[4954]: E1206 10:41:13.445385 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:41:25 crc kubenswrapper[4954]: I1206 10:41:25.457214 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:41:25 crc kubenswrapper[4954]: E1206 10:41:25.458045 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:41:36 crc kubenswrapper[4954]: I1206 10:41:36.443790 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:41:36 crc kubenswrapper[4954]: E1206 10:41:36.444545 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:41:48 crc kubenswrapper[4954]: I1206 10:41:48.444755 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:41:48 crc kubenswrapper[4954]: E1206 10:41:48.445365 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:41:59 crc kubenswrapper[4954]: I1206 10:41:59.444199 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:41:59 crc kubenswrapper[4954]: E1206 10:41:59.444859 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:42:13 crc kubenswrapper[4954]: I1206 10:42:13.443134 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:42:13 crc kubenswrapper[4954]: E1206 10:42:13.443910 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:42:25 crc kubenswrapper[4954]: I1206 10:42:25.454010 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:42:25 crc kubenswrapper[4954]: E1206 10:42:25.455117 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413"
Dec 06 10:42:40 crc kubenswrapper[4954]: I1206 10:42:40.444700 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e"
Dec 06 10:42:42 crc kubenswrapper[4954]: I1206 10:42:42.059207 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad"}
Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.672702 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xrr62"]
Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673836 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="extract-utilities"
Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673856 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="extract-utilities"
Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673889 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="registry-server"
Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673896 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="registry-server"
Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673920 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="registry-server"
Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673926 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="registry-server"
Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673938 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="extract-utilities"
removing container" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="extract-utilities" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673944 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="extract-utilities" Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673957 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="extract-content" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673963 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="extract-content" Dec 06 10:44:24 crc kubenswrapper[4954]: E1206 10:44:24.673979 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="extract-content" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.673984 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="extract-content" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.674188 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="879ab57c-2611-4c25-be84-ab404c500a86" containerName="registry-server" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.674225 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="549a9acc-8ef7-4188-aa1c-1f616e54c4be" containerName="registry-server" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.676753 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.684593 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrr62"] Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.832088 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.832144 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrlp\" (UniqueName: \"kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.832193 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.933918 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 
10:44:24.933976 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrlp\" (UniqueName: \"kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.934026 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.934585 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.934626 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:24 crc kubenswrapper[4954]: I1206 10:44:24.953353 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrlp\" (UniqueName: \"kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp\") pod \"certified-operators-xrr62\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:25 crc kubenswrapper[4954]: I1206 10:44:25.051991 4954 util.go:30] "No sandbox for pod can be found. 
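The VerifyControllerAttachedVolume → MountVolume → MountVolume.SetUp sequence above walks the pod's three volumes: two emptyDirs (catalog-content, utilities) and the projected service-account token (kube-api-access-smrlp). A sketch of that volume set as corev1 structs; in practice the kube-api-access-* volume is injected automatically, and it is spelled out here only for illustration:

```go
// Sketch of the volumes the reconciler mounts above (and tears down later).
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

var catalogVolumes = []corev1.Volume{
	{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
		EmptyDir: &corev1.EmptyDirVolumeSource{}, // node-local, deleted with the pod
	}},
	{Name: "utilities", VolumeSource: corev1.VolumeSource{
		EmptyDir: &corev1.EmptyDirVolumeSource{},
	}},
	{Name: "kube-api-access-smrlp", VolumeSource: corev1.VolumeSource{
		Projected: &corev1.ProjectedVolumeSource{
			Sources: []corev1.VolumeProjection{{
				ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"},
			}},
		},
	}},
}

func main() {
	for _, v := range catalogVolumes {
		fmt.Println("volume:", v.Name)
	}
}
```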
Need to start a new one" pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:26 crc kubenswrapper[4954]: I1206 10:44:26.630285 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrr62"] Dec 06 10:44:27 crc kubenswrapper[4954]: I1206 10:44:27.204195 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerStarted","Data":"e29cc4e0f8aaf0225aa9bac70f850e616d479e676181112b5761cf9dfd903366"} Dec 06 10:44:27 crc kubenswrapper[4954]: I1206 10:44:27.204609 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerStarted","Data":"ee25b213cc6d019f7998cd4bb797c6eace109e416cb0826c0c1c96f953062bd5"} Dec 06 10:44:28 crc kubenswrapper[4954]: I1206 10:44:28.216712 4954 generic.go:334] "Generic (PLEG): container finished" podID="faeb5445-6a23-44a9-ad71-21c67d705519" containerID="e29cc4e0f8aaf0225aa9bac70f850e616d479e676181112b5761cf9dfd903366" exitCode=0 Dec 06 10:44:28 crc kubenswrapper[4954]: I1206 10:44:28.216985 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerDied","Data":"e29cc4e0f8aaf0225aa9bac70f850e616d479e676181112b5761cf9dfd903366"} Dec 06 10:44:28 crc kubenswrapper[4954]: I1206 10:44:28.219076 4954 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 10:44:30 crc kubenswrapper[4954]: I1206 10:44:30.236639 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerStarted","Data":"2784e017f4be92f4918962573de3f10adfa3418460e36473df0146652dd3c3cc"} Dec 06 10:44:32 crc kubenswrapper[4954]: I1206 10:44:32.262937 4954 patch_prober.go:28] interesting pod/console-operator-58897d9998-7p9tp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:44:32 crc kubenswrapper[4954]: I1206 10:44:32.263042 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-7p9tp" podUID="9078eab8-cd16-404e-a8e6-e02c60ddfe16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 10:44:33 crc kubenswrapper[4954]: I1206 10:44:33.855011 4954 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4x2sh container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 10:44:33 crc kubenswrapper[4954]: I1206 10:44:33.855460 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4x2sh" podUID="e48f6ba2-ebc4-45d0-bc53-a6f50cf31a67" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for 
Dec 06 10:44:33 crc kubenswrapper[4954]: I1206 10:44:33.902598 4954 trace.go:236] Trace[1857718253]: "Calculate volume metrics of swift for pod openstack/swift-storage-2" (06-Dec-2025 10:44:30.354) (total time: 3529ms):
Dec 06 10:44:33 crc kubenswrapper[4954]: Trace[1857718253]: [3.529746348s] [3.529746348s] END
Dec 06 10:44:36 crc kubenswrapper[4954]: I1206 10:44:36.941481 4954 generic.go:334] "Generic (PLEG): container finished" podID="faeb5445-6a23-44a9-ad71-21c67d705519" containerID="2784e017f4be92f4918962573de3f10adfa3418460e36473df0146652dd3c3cc" exitCode=0
Dec 06 10:44:36 crc kubenswrapper[4954]: I1206 10:44:36.941588 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerDied","Data":"2784e017f4be92f4918962573de3f10adfa3418460e36473df0146652dd3c3cc"}
Dec 06 10:44:41 crc kubenswrapper[4954]: I1206 10:44:41.998311 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerStarted","Data":"ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4"}
Dec 06 10:44:42 crc kubenswrapper[4954]: I1206 10:44:42.016853 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xrr62" podStartSLOduration=6.395533624 podStartE2EDuration="18.016836057s" podCreationTimestamp="2025-12-06 10:44:24 +0000 UTC" firstStartedPulling="2025-12-06 10:44:28.21883784 +0000 UTC m=+13643.032197229" lastFinishedPulling="2025-12-06 10:44:39.840140273 +0000 UTC m=+13654.653499662" observedRunningTime="2025-12-06 10:44:42.014016732 +0000 UTC m=+13656.827376121" watchObservedRunningTime="2025-12-06 10:44:42.016836057 +0000 UTC m=+13656.830195436"
Dec 06 10:44:45 crc kubenswrapper[4954]: I1206 10:44:45.053233 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xrr62"
Dec 06 10:44:45 crc kubenswrapper[4954]: I1206 10:44:45.053592 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xrr62"
Dec 06 10:44:46 crc kubenswrapper[4954]: I1206 10:44:46.138003 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xrr62" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="registry-server" probeResult="failure" output=<
Dec 06 10:44:46 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s
Dec 06 10:44:46 crc kubenswrapper[4954]: >
Dec 06 10:44:55 crc kubenswrapper[4954]: I1206 10:44:55.106231 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xrr62"
Dec 06 10:44:55 crc kubenswrapper[4954]: I1206 10:44:55.182133 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xrr62"
Dec 06 10:44:56 crc kubenswrapper[4954]: I1206 10:44:56.257582 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrr62"]
Dec 06 10:44:56 crc kubenswrapper[4954]: I1206 10:44:56.258115 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xrr62" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="registry-server" containerID="cri-o://ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4" gracePeriod=2
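The startup-latency record above decomposes cleanly: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (10:44:42.016836057 − 10:44:24 = 18.016836057 s), and podStartSLOduration subtracts the image-pull window, lastFinishedPulling − firstStartedPulling = 10:44:39.840140273 − 10:44:28.21883784 = 11.621302433 s, giving 18.016836057 − 11.621302433 = 6.395533624 s. The community-operators-xj59d record later in this journal follows the same arithmetic: 14.95737202 s end-to-end minus a 12.000709976 s pull window ≈ 2.956662044 s.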
containerID="cri-o://ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4" gracePeriod=2 Dec 06 10:44:57 crc kubenswrapper[4954]: I1206 10:44:57.147295 4954 generic.go:334] "Generic (PLEG): container finished" podID="faeb5445-6a23-44a9-ad71-21c67d705519" containerID="ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4" exitCode=0 Dec 06 10:44:57 crc kubenswrapper[4954]: I1206 10:44:57.147324 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerDied","Data":"ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4"} Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.385258 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.558522 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities\") pod \"faeb5445-6a23-44a9-ad71-21c67d705519\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.558809 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content\") pod \"faeb5445-6a23-44a9-ad71-21c67d705519\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.558873 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrlp\" (UniqueName: \"kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp\") pod \"faeb5445-6a23-44a9-ad71-21c67d705519\" (UID: \"faeb5445-6a23-44a9-ad71-21c67d705519\") " Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.559266 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities" (OuterVolumeSpecName: "utilities") pod "faeb5445-6a23-44a9-ad71-21c67d705519" (UID: "faeb5445-6a23-44a9-ad71-21c67d705519"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.559887 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.565420 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp" (OuterVolumeSpecName: "kube-api-access-smrlp") pod "faeb5445-6a23-44a9-ad71-21c67d705519" (UID: "faeb5445-6a23-44a9-ad71-21c67d705519"). InnerVolumeSpecName "kube-api-access-smrlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.616054 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faeb5445-6a23-44a9-ad71-21c67d705519" (UID: "faeb5445-6a23-44a9-ad71-21c67d705519"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.662287 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faeb5445-6a23-44a9-ad71-21c67d705519-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:44:58 crc kubenswrapper[4954]: I1206 10:44:58.662322 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrlp\" (UniqueName: \"kubernetes.io/projected/faeb5445-6a23-44a9-ad71-21c67d705519-kube-api-access-smrlp\") on node \"crc\" DevicePath \"\"" Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.171479 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrr62" event={"ID":"faeb5445-6a23-44a9-ad71-21c67d705519","Type":"ContainerDied","Data":"ee25b213cc6d019f7998cd4bb797c6eace109e416cb0826c0c1c96f953062bd5"} Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.171528 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrr62" Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.171589 4954 scope.go:117] "RemoveContainer" containerID="ee680dc903ea08a63cffc271a83ddb2600d634371a0464b1acf975b60e5d2fe4" Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.211544 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrr62"] Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.224634 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xrr62"] Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.228140 4954 scope.go:117] "RemoveContainer" containerID="2784e017f4be92f4918962573de3f10adfa3418460e36473df0146652dd3c3cc" Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.256866 4954 scope.go:117] "RemoveContainer" containerID="e29cc4e0f8aaf0225aa9bac70f850e616d479e676181112b5761cf9dfd903366" Dec 06 10:44:59 crc kubenswrapper[4954]: I1206 10:44:59.455409 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" path="/var/lib/kubelet/pods/faeb5445-6a23-44a9-ad71-21c67d705519/volumes" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.172686 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv"] Dec 06 10:45:00 crc kubenswrapper[4954]: E1206 10:45:00.173245 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="registry-server" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.173267 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="registry-server" Dec 06 10:45:00 crc kubenswrapper[4954]: E1206 10:45:00.173289 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="extract-content" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.173298 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="extract-content" Dec 06 10:45:00 crc kubenswrapper[4954]: E1206 10:45:00.173351 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="extract-utilities" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.173362 4954 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="extract-utilities" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.173679 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="faeb5445-6a23-44a9-ad71-21c67d705519" containerName="registry-server" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.174769 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.178190 4954 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.179225 4954 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.197554 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv"] Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.296386 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmlq\" (UniqueName: \"kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.296530 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.297072 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.399526 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.400070 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.400125 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmlq\" (UniqueName: \"kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq\") pod 
\"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.400853 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.406820 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.417301 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmlq\" (UniqueName: \"kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq\") pod \"collect-profiles-29416965-6hkqv\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:00 crc kubenswrapper[4954]: I1206 10:45:00.501276 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:01 crc kubenswrapper[4954]: I1206 10:45:01.238936 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv"] Dec 06 10:45:02 crc kubenswrapper[4954]: I1206 10:45:02.212244 4954 generic.go:334] "Generic (PLEG): container finished" podID="9a70bea5-fe2f-41d1-a967-e386c2734e81" containerID="f08f6c38439286c56fd91907bb291f6b25d70817b40859cb6422818d9db5904d" exitCode=0 Dec 06 10:45:02 crc kubenswrapper[4954]: I1206 10:45:02.212291 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" event={"ID":"9a70bea5-fe2f-41d1-a967-e386c2734e81","Type":"ContainerDied","Data":"f08f6c38439286c56fd91907bb291f6b25d70817b40859cb6422818d9db5904d"} Dec 06 10:45:02 crc kubenswrapper[4954]: I1206 10:45:02.212318 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" event={"ID":"9a70bea5-fe2f-41d1-a967-e386c2734e81","Type":"ContainerStarted","Data":"93d7bf4e4c905d96cac7937009bbf8c1d5fd31387faf78d21229183ad1366d64"} Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.030986 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.112575 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume\") pod \"9a70bea5-fe2f-41d1-a967-e386c2734e81\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.112734 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume\") pod \"9a70bea5-fe2f-41d1-a967-e386c2734e81\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.112774 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmlq\" (UniqueName: \"kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq\") pod \"9a70bea5-fe2f-41d1-a967-e386c2734e81\" (UID: \"9a70bea5-fe2f-41d1-a967-e386c2734e81\") " Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.113424 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a70bea5-fe2f-41d1-a967-e386c2734e81" (UID: "9a70bea5-fe2f-41d1-a967-e386c2734e81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.119410 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a70bea5-fe2f-41d1-a967-e386c2734e81" (UID: "9a70bea5-fe2f-41d1-a967-e386c2734e81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.120320 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq" (OuterVolumeSpecName: "kube-api-access-8wmlq") pod "9a70bea5-fe2f-41d1-a967-e386c2734e81" (UID: "9a70bea5-fe2f-41d1-a967-e386c2734e81"). InnerVolumeSpecName "kube-api-access-8wmlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.215171 4954 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a70bea5-fe2f-41d1-a967-e386c2734e81-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.215210 4954 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a70bea5-fe2f-41d1-a967-e386c2734e81-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.215225 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmlq\" (UniqueName: \"kubernetes.io/projected/9a70bea5-fe2f-41d1-a967-e386c2734e81-kube-api-access-8wmlq\") on node \"crc\" DevicePath \"\"" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.241132 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" event={"ID":"9a70bea5-fe2f-41d1-a967-e386c2734e81","Type":"ContainerDied","Data":"93d7bf4e4c905d96cac7937009bbf8c1d5fd31387faf78d21229183ad1366d64"} Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.241174 4954 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d7bf4e4c905d96cac7937009bbf8c1d5fd31387faf78d21229183ad1366d64" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:05.241193 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416965-6hkqv" Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:06.143018 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j"] Dec 06 10:45:06 crc kubenswrapper[4954]: I1206 10:45:06.166387 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416920-v2n7j"] Dec 06 10:45:07 crc kubenswrapper[4954]: I1206 10:45:07.454593 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda2e2be-14f5-448f-994e-6a7c0e9a43a1" path="/var/lib/kubelet/pods/bda2e2be-14f5-448f-994e-6a7c0e9a43a1/volumes" Dec 06 10:45:10 crc kubenswrapper[4954]: I1206 10:45:10.100922 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:45:10 crc kubenswrapper[4954]: I1206 10:45:10.101391 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:45:40 crc kubenswrapper[4954]: I1206 10:45:40.100910 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:45:40 crc kubenswrapper[4954]: I1206 10:45:40.101346 4954 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:45:41 crc kubenswrapper[4954]: I1206 10:45:41.973747 4954 scope.go:117] "RemoveContainer" containerID="bdc87d53b14d2e5ba2258f705b75e6e7ec2c804645b37c8dab7631c7d23994c4" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.291972 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:45:50 crc kubenswrapper[4954]: E1206 10:45:50.293144 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a70bea5-fe2f-41d1-a967-e386c2734e81" containerName="collect-profiles" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.293167 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a70bea5-fe2f-41d1-a967-e386c2734e81" containerName="collect-profiles" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.293541 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a70bea5-fe2f-41d1-a967-e386c2734e81" containerName="collect-profiles" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.303256 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.321213 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.402603 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dd77\" (UniqueName: \"kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.402765 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.403891 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.506294 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.506403 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dd77\" (UniqueName: \"kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77\") pod \"community-operators-xj59d\" (UID: 
\"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.506495 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.507161 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.507240 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.550544 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dd77\" (UniqueName: \"kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77\") pod \"community-operators-xj59d\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:50 crc kubenswrapper[4954]: I1206 10:45:50.636413 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:45:51 crc kubenswrapper[4954]: I1206 10:45:51.420180 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:45:51 crc kubenswrapper[4954]: I1206 10:45:51.770555 4954 generic.go:334] "Generic (PLEG): container finished" podID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerID="afeb2d1dbe6b5a73537796ab36dc9d1582977ef2815e973c268036c3d4159a57" exitCode=0 Dec 06 10:45:51 crc kubenswrapper[4954]: I1206 10:45:51.770702 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerDied","Data":"afeb2d1dbe6b5a73537796ab36dc9d1582977ef2815e973c268036c3d4159a57"} Dec 06 10:45:51 crc kubenswrapper[4954]: I1206 10:45:51.770970 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerStarted","Data":"f908d376a59b574fe26724b6ebf8b616d6205d8c04ce37d5f94864bcfc46fa85"} Dec 06 10:45:57 crc kubenswrapper[4954]: I1206 10:45:57.838937 4954 generic.go:334] "Generic (PLEG): container finished" podID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerID="ff4059269113fb66ba8e9a7d4645cc6e30a0491ae5d9f45a05d69c6ed25c26b1" exitCode=0 Dec 06 10:45:57 crc kubenswrapper[4954]: I1206 10:45:57.839055 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kc95s/must-gather-xz68n" event={"ID":"c6ef6020-8a06-4945-9286-ed7dec7f863b","Type":"ContainerDied","Data":"ff4059269113fb66ba8e9a7d4645cc6e30a0491ae5d9f45a05d69c6ed25c26b1"} Dec 06 10:45:57 crc kubenswrapper[4954]: I1206 10:45:57.840151 
4954 scope.go:117] "RemoveContainer" containerID="ff4059269113fb66ba8e9a7d4645cc6e30a0491ae5d9f45a05d69c6ed25c26b1" Dec 06 10:45:58 crc kubenswrapper[4954]: I1206 10:45:58.447534 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kc95s_must-gather-xz68n_c6ef6020-8a06-4945-9286-ed7dec7f863b/gather/0.log" Dec 06 10:45:59 crc kubenswrapper[4954]: I1206 10:45:59.869665 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerStarted","Data":"23800d71b76fdcae13ac0d78032490bf3687b2cad91ebe7924a34a4d8bb21e85"} Dec 06 10:46:01 crc kubenswrapper[4954]: I1206 10:46:01.891440 4954 generic.go:334] "Generic (PLEG): container finished" podID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerID="23800d71b76fdcae13ac0d78032490bf3687b2cad91ebe7924a34a4d8bb21e85" exitCode=0 Dec 06 10:46:01 crc kubenswrapper[4954]: I1206 10:46:01.891494 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerDied","Data":"23800d71b76fdcae13ac0d78032490bf3687b2cad91ebe7924a34a4d8bb21e85"} Dec 06 10:46:04 crc kubenswrapper[4954]: I1206 10:46:04.935127 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerStarted","Data":"4d27c109839177fdfa1302d3be685f7de64e5c5571ef313f81873d9ae9e1670b"} Dec 06 10:46:04 crc kubenswrapper[4954]: I1206 10:46:04.957391 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xj59d" podStartSLOduration=2.9566620439999998 podStartE2EDuration="14.95737202s" podCreationTimestamp="2025-12-06 10:45:50 +0000 UTC" firstStartedPulling="2025-12-06 10:45:51.772812233 +0000 UTC m=+13726.586171622" lastFinishedPulling="2025-12-06 10:46:03.773522209 +0000 UTC m=+13738.586881598" observedRunningTime="2025-12-06 10:46:04.9558873 +0000 UTC m=+13739.769246719" watchObservedRunningTime="2025-12-06 10:46:04.95737202 +0000 UTC m=+13739.770731409" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.101050 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.101477 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.101527 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.102460 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.102529 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad" gracePeriod=600 Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.636820 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.637134 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:10 crc kubenswrapper[4954]: I1206 10:46:10.696523 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.010367 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad" exitCode=0 Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.011720 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad"} Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.011753 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerStarted","Data":"f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c"} Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.011770 4954 scope.go:117] "RemoveContainer" containerID="acedab5a924675d3dfa36107c79253a2bf19d288888043803e0576f67e23489e" Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.071548 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:11 crc kubenswrapper[4954]: I1206 10:46:11.123494 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:46:13 crc kubenswrapper[4954]: I1206 10:46:13.034351 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xj59d" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="registry-server" containerID="cri-o://4d27c109839177fdfa1302d3be685f7de64e5c5571ef313f81873d9ae9e1670b" gracePeriod=2 Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.048014 4954 generic.go:334] "Generic (PLEG): container finished" podID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerID="4d27c109839177fdfa1302d3be685f7de64e5c5571ef313f81873d9ae9e1670b" exitCode=0 Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.048096 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerDied","Data":"4d27c109839177fdfa1302d3be685f7de64e5c5571ef313f81873d9ae9e1670b"} Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.539381 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.718725 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content\") pod \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.718789 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities\") pod \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.718965 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dd77\" (UniqueName: \"kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77\") pod \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\" (UID: \"f1fe9a36-080b-47c6-9332-ac1a1eb25e18\") " Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.720076 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities" (OuterVolumeSpecName: "utilities") pod "f1fe9a36-080b-47c6-9332-ac1a1eb25e18" (UID: "f1fe9a36-080b-47c6-9332-ac1a1eb25e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.728193 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77" (OuterVolumeSpecName: "kube-api-access-7dd77") pod "f1fe9a36-080b-47c6-9332-ac1a1eb25e18" (UID: "f1fe9a36-080b-47c6-9332-ac1a1eb25e18"). InnerVolumeSpecName "kube-api-access-7dd77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.772395 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1fe9a36-080b-47c6-9332-ac1a1eb25e18" (UID: "f1fe9a36-080b-47c6-9332-ac1a1eb25e18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.821529 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dd77\" (UniqueName: \"kubernetes.io/projected/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-kube-api-access-7dd77\") on node \"crc\" DevicePath \"\"" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.821931 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:46:14 crc kubenswrapper[4954]: I1206 10:46:14.821949 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1fe9a36-080b-47c6-9332-ac1a1eb25e18-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.060510 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xj59d" event={"ID":"f1fe9a36-080b-47c6-9332-ac1a1eb25e18","Type":"ContainerDied","Data":"f908d376a59b574fe26724b6ebf8b616d6205d8c04ce37d5f94864bcfc46fa85"} Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.060582 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xj59d" Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.060602 4954 scope.go:117] "RemoveContainer" containerID="4d27c109839177fdfa1302d3be685f7de64e5c5571ef313f81873d9ae9e1670b" Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.088627 4954 scope.go:117] "RemoveContainer" containerID="23800d71b76fdcae13ac0d78032490bf3687b2cad91ebe7924a34a4d8bb21e85" Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.105458 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.120738 4954 scope.go:117] "RemoveContainer" containerID="afeb2d1dbe6b5a73537796ab36dc9d1582977ef2815e973c268036c3d4159a57" Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.148046 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xj59d"] Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.266543 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kc95s/must-gather-xz68n"] Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.266818 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kc95s/must-gather-xz68n" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="copy" containerID="cri-o://7359f1c0f5e65d1baf2629d769298e5ac9ed07e3dfce69c7b414fbf11d976d88" gracePeriod=2 Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.275702 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kc95s/must-gather-xz68n"] Dec 06 10:46:15 crc kubenswrapper[4954]: I1206 10:46:15.463032 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" path="/var/lib/kubelet/pods/f1fe9a36-080b-47c6-9332-ac1a1eb25e18/volumes" Dec 06 10:46:16 crc kubenswrapper[4954]: I1206 10:46:16.074596 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kc95s_must-gather-xz68n_c6ef6020-8a06-4945-9286-ed7dec7f863b/copy/0.log" Dec 06 10:46:16 crc kubenswrapper[4954]: I1206 10:46:16.076085 4954 generic.go:334] "Generic (PLEG): container 
finished" podID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerID="7359f1c0f5e65d1baf2629d769298e5ac9ed07e3dfce69c7b414fbf11d976d88" exitCode=143 Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.026889 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kc95s_must-gather-xz68n_c6ef6020-8a06-4945-9286-ed7dec7f863b/copy/0.log" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.027641 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/must-gather-xz68n" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.087949 4954 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kc95s_must-gather-xz68n_c6ef6020-8a06-4945-9286-ed7dec7f863b/copy/0.log" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.089177 4954 scope.go:117] "RemoveContainer" containerID="7359f1c0f5e65d1baf2629d769298e5ac9ed07e3dfce69c7b414fbf11d976d88" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.089252 4954 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kc95s/must-gather-xz68n" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.129698 4954 scope.go:117] "RemoveContainer" containerID="ff4059269113fb66ba8e9a7d4645cc6e30a0491ae5d9f45a05d69c6ed25c26b1" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.182412 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output\") pod \"c6ef6020-8a06-4945-9286-ed7dec7f863b\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.182602 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pjls\" (UniqueName: \"kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls\") pod \"c6ef6020-8a06-4945-9286-ed7dec7f863b\" (UID: \"c6ef6020-8a06-4945-9286-ed7dec7f863b\") " Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.189782 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls" (OuterVolumeSpecName: "kube-api-access-6pjls") pod "c6ef6020-8a06-4945-9286-ed7dec7f863b" (UID: "c6ef6020-8a06-4945-9286-ed7dec7f863b"). InnerVolumeSpecName "kube-api-access-6pjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.284636 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pjls\" (UniqueName: \"kubernetes.io/projected/c6ef6020-8a06-4945-9286-ed7dec7f863b-kube-api-access-6pjls\") on node \"crc\" DevicePath \"\"" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.465273 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c6ef6020-8a06-4945-9286-ed7dec7f863b" (UID: "c6ef6020-8a06-4945-9286-ed7dec7f863b"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:46:17 crc kubenswrapper[4954]: I1206 10:46:17.489092 4954 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6ef6020-8a06-4945-9286-ed7dec7f863b-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 10:46:19 crc kubenswrapper[4954]: I1206 10:46:19.456547 4954 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" path="/var/lib/kubelet/pods/c6ef6020-8a06-4945-9286-ed7dec7f863b/volumes" Dec 06 10:48:10 crc kubenswrapper[4954]: I1206 10:48:10.104055 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:48:10 crc kubenswrapper[4954]: I1206 10:48:10.104606 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:48:40 crc kubenswrapper[4954]: I1206 10:48:40.101195 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:48:40 crc kubenswrapper[4954]: I1206 10:48:40.102860 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:49:10 crc kubenswrapper[4954]: I1206 10:49:10.101280 4954 patch_prober.go:28] interesting pod/machine-config-daemon-f5lgw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 10:49:10 crc kubenswrapper[4954]: I1206 10:49:10.101827 4954 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 10:49:10 crc kubenswrapper[4954]: I1206 10:49:10.101881 4954 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" Dec 06 10:49:10 crc kubenswrapper[4954]: I1206 10:49:10.102820 4954 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c"} pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 10:49:10 crc kubenswrapper[4954]: I1206 10:49:10.102882 4954 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerName="machine-config-daemon" containerID="cri-o://f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c" gracePeriod=600 Dec 06 10:49:10 crc kubenswrapper[4954]: E1206 10:49:10.232234 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:49:11 crc kubenswrapper[4954]: I1206 10:49:11.011644 4954 generic.go:334] "Generic (PLEG): container finished" podID="7e0babbe-21ce-42f4-90cf-c3eb21991413" containerID="f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c" exitCode=0 Dec 06 10:49:11 crc kubenswrapper[4954]: I1206 10:49:11.011751 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" event={"ID":"7e0babbe-21ce-42f4-90cf-c3eb21991413","Type":"ContainerDied","Data":"f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c"} Dec 06 10:49:11 crc kubenswrapper[4954]: I1206 10:49:11.011979 4954 scope.go:117] "RemoveContainer" containerID="1cb4e4c868fdbb8d7f45835a42881eb0b59cf12b98813a43b4488157840240ad" Dec 06 10:49:11 crc kubenswrapper[4954]: I1206 10:49:11.012778 4954 scope.go:117] "RemoveContainer" containerID="f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c" Dec 06 10:49:11 crc kubenswrapper[4954]: E1206 10:49:11.013103 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.849525 4954 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"] Dec 06 10:49:21 crc kubenswrapper[4954]: E1206 10:49:21.850889 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="registry-server" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.850907 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="registry-server" Dec 06 10:49:21 crc kubenswrapper[4954]: E1206 10:49:21.850948 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="gather" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.850956 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="gather" Dec 06 10:49:21 crc kubenswrapper[4954]: E1206 10:49:21.850977 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="extract-content" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.850986 4954 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="extract-content" Dec 06 10:49:21 crc kubenswrapper[4954]: E1206 10:49:21.851001 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="extract-utilities" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.851011 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="extract-utilities" Dec 06 10:49:21 crc kubenswrapper[4954]: E1206 10:49:21.851036 4954 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="copy" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.851042 4954 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="copy" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.851326 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="copy" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.851359 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ef6020-8a06-4945-9286-ed7dec7f863b" containerName="gather" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.851382 4954 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1fe9a36-080b-47c6-9332-ac1a1eb25e18" containerName="registry-server" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.853329 4954 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.863548 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"] Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.892031 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.892151 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.892349 4954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7qs\" (UniqueName: \"kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.993757 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.993823 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.993904 4954 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7qs\" (UniqueName: \"kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.994268 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:21 crc kubenswrapper[4954]: I1206 10:49:21.994341 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:22 crc kubenswrapper[4954]: I1206 10:49:22.013941 4954 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7qs\" (UniqueName: \"kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs\") pod \"redhat-operators-qqw59\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:22 crc kubenswrapper[4954]: I1206 10:49:22.171058 4954 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:22 crc kubenswrapper[4954]: I1206 10:49:22.929200 4954 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"] Dec 06 10:49:23 crc kubenswrapper[4954]: I1206 10:49:23.132772 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerStarted","Data":"7fe35c6aa26d9e2773e73f39dabb44c5b8930082e1c49b374653979ea366182f"} Dec 06 10:49:24 crc kubenswrapper[4954]: I1206 10:49:24.144681 4954 generic.go:334] "Generic (PLEG): container finished" podID="4ca72985-4cd5-4ede-9a81-4f4a75f94cab" containerID="82cbd2fea27136dbe0ab562fe346af3d3e3628f878a060045578f364fcfd1724" exitCode=0 Dec 06 10:49:24 crc kubenswrapper[4954]: I1206 10:49:24.144798 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerDied","Data":"82cbd2fea27136dbe0ab562fe346af3d3e3628f878a060045578f364fcfd1724"} Dec 06 10:49:24 crc kubenswrapper[4954]: I1206 10:49:24.443280 4954 scope.go:117] "RemoveContainer" containerID="f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c" Dec 06 10:49:24 crc kubenswrapper[4954]: E1206 10:49:24.443562 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:49:25 crc kubenswrapper[4954]: I1206 10:49:25.158540 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerStarted","Data":"d900160449796cd2e91bb051260b98214f6f08128ccbe5cd868a4d36c18c4c6a"} Dec 06 10:49:28 crc kubenswrapper[4954]: I1206 10:49:28.193131 4954 generic.go:334] "Generic (PLEG): container finished" podID="4ca72985-4cd5-4ede-9a81-4f4a75f94cab" containerID="d900160449796cd2e91bb051260b98214f6f08128ccbe5cd868a4d36c18c4c6a" exitCode=0 Dec 06 10:49:28 crc kubenswrapper[4954]: I1206 10:49:28.193242 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerDied","Data":"d900160449796cd2e91bb051260b98214f6f08128ccbe5cd868a4d36c18c4c6a"} Dec 06 10:49:31 crc kubenswrapper[4954]: I1206 10:49:31.223041 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerStarted","Data":"35d6b11298aa891e4e688517df43ea4752ddf7a947cb4a9aef05e6db74bdb178"} Dec 06 10:49:31 crc kubenswrapper[4954]: I1206 10:49:31.254811 4954 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qqw59" podStartSLOduration=3.892856289 podStartE2EDuration="10.254789867s" podCreationTimestamp="2025-12-06 10:49:21 +0000 UTC" firstStartedPulling="2025-12-06 10:49:24.146582937 +0000 UTC m=+13938.959942326" lastFinishedPulling="2025-12-06 10:49:30.508516515 +0000 UTC m=+13945.321875904" observedRunningTime="2025-12-06 10:49:31.248329565 
+0000 UTC m=+13946.061688964" watchObservedRunningTime="2025-12-06 10:49:31.254789867 +0000 UTC m=+13946.068149256" Dec 06 10:49:32 crc kubenswrapper[4954]: I1206 10:49:32.172178 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:32 crc kubenswrapper[4954]: I1206 10:49:32.172325 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:33 crc kubenswrapper[4954]: I1206 10:49:33.226540 4954 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qqw59" podUID="4ca72985-4cd5-4ede-9a81-4f4a75f94cab" containerName="registry-server" probeResult="failure" output=< Dec 06 10:49:33 crc kubenswrapper[4954]: timeout: failed to connect service ":50051" within 1s Dec 06 10:49:33 crc kubenswrapper[4954]: > Dec 06 10:49:36 crc kubenswrapper[4954]: I1206 10:49:36.444079 4954 scope.go:117] "RemoveContainer" containerID="f2cd0a92467771b4777c43b0c6e43b5db767be10fd5394f1e72bc471475f1b5c" Dec 06 10:49:36 crc kubenswrapper[4954]: E1206 10:49:36.444847 4954 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5lgw_openshift-machine-config-operator(7e0babbe-21ce-42f4-90cf-c3eb21991413)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5lgw" podUID="7e0babbe-21ce-42f4-90cf-c3eb21991413" Dec 06 10:49:42 crc kubenswrapper[4954]: I1206 10:49:42.223556 4954 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:42 crc kubenswrapper[4954]: I1206 10:49:42.279962 4954 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:42 crc kubenswrapper[4954]: I1206 10:49:42.468232 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"] Dec 06 10:49:43 crc kubenswrapper[4954]: I1206 10:49:43.348062 4954 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qqw59" podUID="4ca72985-4cd5-4ede-9a81-4f4a75f94cab" containerName="registry-server" containerID="cri-o://35d6b11298aa891e4e688517df43ea4752ddf7a947cb4a9aef05e6db74bdb178" gracePeriod=2 Dec 06 10:49:44 crc kubenswrapper[4954]: I1206 10:49:44.400078 4954 generic.go:334] "Generic (PLEG): container finished" podID="4ca72985-4cd5-4ede-9a81-4f4a75f94cab" containerID="35d6b11298aa891e4e688517df43ea4752ddf7a947cb4a9aef05e6db74bdb178" exitCode=0 Dec 06 10:49:44 crc kubenswrapper[4954]: I1206 10:49:44.400121 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerDied","Data":"35d6b11298aa891e4e688517df43ea4752ddf7a947cb4a9aef05e6db74bdb178"} Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.270364 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.307832 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x7qs\" (UniqueName: \"kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs\") pod \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.308069 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content\") pod \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.308117 4954 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities\") pod \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\" (UID: \"4ca72985-4cd5-4ede-9a81-4f4a75f94cab\") " Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.308807 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities" (OuterVolumeSpecName: "utilities") pod "4ca72985-4cd5-4ede-9a81-4f4a75f94cab" (UID: "4ca72985-4cd5-4ede-9a81-4f4a75f94cab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.314350 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs" (OuterVolumeSpecName: "kube-api-access-2x7qs") pod "4ca72985-4cd5-4ede-9a81-4f4a75f94cab" (UID: "4ca72985-4cd5-4ede-9a81-4f4a75f94cab"). InnerVolumeSpecName "kube-api-access-2x7qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.410910 4954 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.410953 4954 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x7qs\" (UniqueName: \"kubernetes.io/projected/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-kube-api-access-2x7qs\") on node \"crc\" DevicePath \"\"" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.415536 4954 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqw59" event={"ID":"4ca72985-4cd5-4ede-9a81-4f4a75f94cab","Type":"ContainerDied","Data":"7fe35c6aa26d9e2773e73f39dabb44c5b8930082e1c49b374653979ea366182f"} Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.415612 4954 scope.go:117] "RemoveContainer" containerID="35d6b11298aa891e4e688517df43ea4752ddf7a947cb4a9aef05e6db74bdb178" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.416208 4954 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqw59" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.417043 4954 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ca72985-4cd5-4ede-9a81-4f4a75f94cab" (UID: "4ca72985-4cd5-4ede-9a81-4f4a75f94cab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.447103 4954 scope.go:117] "RemoveContainer" containerID="d900160449796cd2e91bb051260b98214f6f08128ccbe5cd868a4d36c18c4c6a" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.486010 4954 scope.go:117] "RemoveContainer" containerID="82cbd2fea27136dbe0ab562fe346af3d3e3628f878a060045578f364fcfd1724" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.512984 4954 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca72985-4cd5-4ede-9a81-4f4a75f94cab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.750866 4954 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"] Dec 06 10:49:45 crc kubenswrapper[4954]: I1206 10:49:45.762747 4954 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qqw59"]